
ELECTRONIC COMMUNICATIONS in PROBABILITY

THE MEASURABILITY OF HITTING TIMES

RICHARD F. BASS1

Department of Mathematics, University of Connecticut, Storrs, CT 06269-3009 email: [email protected]

Submitted January 18, 2010, accepted in final form March 8, 2010.
AMS 2000 Subject classification: Primary: 60G07; Secondary: 60G05, 60G40

Keywords: Stopping time, hitting time, progressively measurable, optional, predictable, debut theorem, section theorem

Abstract

Under very general conditions the hitting time of a set by a stochastic process is a stopping time. We give a new simple proof of this fact. The section theorems for optional and predictable sets are easy corollaries of the proof.

1 Introduction

A fundamental theorem in the foundations of stochastic processes is the one that says that, under very general conditions, the first time a stochastic process enters a set is a stopping time. The usual proof uses capacities, analytic sets, and Choquet's capacitability theorem, and is considered hard. To the best of our knowledge, no more than a handful of books have an exposition that starts with the definition of capacity and proceeds to the hitting time theorem. (One that does is [1].) The purpose of this paper is to give a short and elementary proof of this theorem. The proof is simple enough that it could easily be included in a first year graduate course in probability.

In Section 2 we give a proof of the debut theorem, from which the measurability theorem follows. As easy corollaries we obtain the section theorems for optional and predictable sets. This argument is given in Section 3.

2 The debut theorem

Suppose (Ω, F, P) is a probability space. The outer probability P∗ associated with P is given by P∗(A) = inf{P(B) : A ⊂ B, B ∈ F}.

A set A is a P-null set if P(A) = 0. Suppose {F_t} is a filtration satisfying the usual conditions: ∩_{ǫ>0} F_{t+ǫ} = F_t for all t ≥ 0, and each F_t contains every P-null set. Let π : [0,∞)×Ω → Ω be defined by π(t,ω) = ω.
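One elementary property of P∗ is worth recording; it is rederived in the proof of Proposition 2.3 below. The infimum defining P∗ is attained by a measurable cover: choosing C_n ∈ F with A ⊂ C_n and P(C_n) ≤ P∗(A) + 1/n, the set

C = ∩_n C_n ∈ F   satisfies   A ⊂ C and P(C) = P∗(A).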

1 Research partially supported by NSF grant DMS-0901505.


Recall that a random variable T taking values in [0,∞] is a stopping time if (T ≤ t) ∈ F_t for all t; we allow our stopping times to take the value infinity. Since the filtration satisfies the usual conditions, T will be a stopping time if (T < t) ∈ F_t for all t. If {T_i} is a finite or countable collection of stopping times, then sup_i T_i and inf_i T_i are also stopping times.
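The countable case is a quick check; the supremum uses the definition directly, and the infimum uses the criterion just mentioned, which relies on the right continuity of the filtration:

(sup_i T_i ≤ t) = ∩_i (T_i ≤ t) ∈ F_t,   (inf_i T_i < t) = ∪_i (T_i < t) ∈ F_t.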

Given a topological space S, the Borel σ-field is the one generated by the open sets. Let B[0,t] denote the Borel σ-field on [0,t] and B[0,t] × F_t the product σ-field. A process X taking values in a topological space S is progressively measurable if for each t the map (s,ω) → X_s(ω) from [0,t]×Ω to S is measurable with respect to B[0,t] × F_t, that is, the inverse images of Borel subsets of S are elements of B[0,t] × F_t. If the paths of X are right continuous, then X is easily seen to be progressively measurable. The same is true if X has left continuous paths. A subset of [0,∞)×Ω is progressively measurable if its indicator is a progressively measurable process. If E ⊂ [0,∞)×Ω, let D_E(ω) = inf{t ≥ 0 : (t,ω) ∈ E}, the debut of E. We will prove

Theorem 2.1. If E is a progressively measurable set, then D_E is a stopping time.

Fix t. Let K_0(t) be the collection of subsets of [0,t]×Ω of the form K×C, where K is a compact subset of [0,t] and C ∈ F_t. Let K(t) be the collection of finite unions of sets in K_0(t) and let K_δ(t) be the collection of countable intersections of sets in K(t). We say A ∈ B[0,t] × F_t is t-approximable if given ǫ > 0, there exists B ∈ K_δ(t) with B ⊂ A and

P∗(π(A)) ≤ P∗(π(B)) + ǫ. (2.1)

Lemma 2.2. If B ∈ K_δ(t), then π(B) ∈ F_t. If B_n ∈ K_δ(t) and B_n ↓ B, then π(B) = ∩_n π(B_n).

The hypothesis that the B_n be in K_δ(t) is important. For example, if B_n = [1−(1/n), 1)×Ω, then π(B_n) = Ω but π(∩_n B_n) = ∅. This is why the proof given in [2, Lemma 6.18] is incorrect.
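To spell this example out: for every ω and every n, the point (1 − (1/n), ω) lies in B_n, so π(B_n) = Ω. On the other hand, no s satisfies s ≥ 1 − (1/n) for every n while also s < 1, so

∩_n B_n = (∩_n [1 − (1/n), 1)) × Ω = ∅,   hence   π(∩_n B_n) = ∅.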

Proof. If B = K×C, where K is a nonempty subset of [0,t] and C ∈ F_t, then π(B) = C ∈ F_t. Therefore π(B) ∈ F_t if B ∈ K_0(t). If B = ∪_{i=1}^{m} A_i with A_i ∈ K_0(t), then π(B) = ∪_{i=1}^{m} π(A_i) ∈ F_t.

For each ω and each set C, let

S(C)(ω) = {s ≤ t : (s,ω) ∈ C}. (2.2)

If B ∈ K_δ(t) and B_n ↓ B with B_n ∈ K(t) for each n, then S(B_n)(ω) ↓ S(B)(ω), so S(B)(ω) is compact.

Now suppose B ∈ K_δ(t) and take B_n ↓ B with B_n ∈ K_δ(t). S(B_n)(ω) is a compact subset of [0,t] for each n and S(B_n)(ω) ↓ S(B)(ω). One possibility is that ∩_n S(B_n)(ω) ≠ ∅; in this case, if s ∈ ∩_n S(B_n)(ω), then (s,ω) ∈ B_n for each n, and so (s,ω) ∈ B. Therefore ω ∈ π(B_n) for each n and ω ∈ π(B). The other possibility is that ∩_n S(B_n)(ω) = ∅. Since the sequence S(B_n)(ω) is a decreasing sequence of compact sets, S(B_n)(ω) = ∅ for some n, for otherwise {S(B_n)(ω)^c} would be an open cover of [0,t] with no finite subcover. Therefore ω ∉ π(B_n) and ω ∉ π(B). We conclude that π(B) = ∩_n π(B_n).

Finally, suppose B ∈ K_δ(t) and B_n ↓ B with B_n ∈ K(t). Then π(B) = ∩_n π(B_n) ∈ F_t.
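The covering step can be made explicit. If every S(B_n)(ω) were nonempty, then the sets S(B_n)(ω)^c are open in [0,t] and cover it, since

∪_n S(B_n)(ω)^c = (∩_n S(B_n)(ω))^c = [0,t];

but any finite subfamily has union S(B_N)(ω)^c ≠ [0,t] because the sections decrease, contradicting the compactness of [0,t].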

Proposition 2.3. Suppose A is t-approximable. Then π(A) ∈ F_t. Moreover, given ǫ > 0 there exists B ∈ K_δ(t) such that P(π(A) \ π(B)) < ǫ.

Proof. Choose A_n ∈ K_δ(t) with A_n ⊂ A and P(π(A_n)) → P∗(π(A)). Let B_n = A_1 ∪ · · · ∪ A_n and let B = ∪_n B_n. Then B_n ∈ K_δ(t), B_n ↑ B, and P(π(B_n)) ≥ P(π(A_n)) → P∗(π(A)). It follows that π(B_n) ↑ π(B), and so π(B) ∈ F_t and P(π(B)) = lim_n P(π(B_n)) ≥ P∗(π(A)).


For each n, there exists C_n ∈ F such that π(A) ⊂ C_n and P(C_n) ≤ P∗(π(A)) + 1/n. Setting C = ∩_n C_n, we have π(A) ⊂ C and P∗(π(A)) = P(C). Therefore π(B) ⊂ π(A) ⊂ C and P(π(B)) = P∗(π(A)) = P(C). This implies that π(A) \ π(B) is a P-null set, and by the completeness assumption, π(A) = (π(A) \ π(B)) ∪ π(B) ∈ F_t. Finally,

lim_n P(π(A) \ π(B_n)) = P(π(A) \ π(B)) = 0.

We now prove Theorem 2.1.

Proof of Theorem 2.1. Fix t. Let

M = {A ∈ B[0,t] × F_t : A is t-approximable}.

We show M is a monotone class. Suppose A_n ∈ M, A_n ↑ A. Then π(A_n) ↑ π(A). By Proposition 2.3, π(A_n) ∈ F_t for all n, and therefore π(A) ∈ F_t and P(π(A)) = lim_n P(π(A_n)). Choose n large so that P(π(A)) < P(π(A_n)) + ǫ/2. Then choose K_n ∈ K_δ(t) such that K_n ⊂ A_n and P(π(A_n)) < P(π(K_n)) + ǫ/2. This shows A is t-approximable.
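Explicitly, combining the two estimates verifies (2.1) with B = K_n: since K_n ⊂ A_n ⊂ A and both π(A) and π(K_n) lie in F_t,

P∗(π(A)) = P(π(A)) < P(π(A_n)) + ǫ/2 < P(π(K_n)) + ǫ = P∗(π(K_n)) + ǫ.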

Now suppose A_n ∈ M and A_n ↓ A. Choose K_n ∈ K_δ(t) such that K_n ⊂ A_n and P(π(A_n) \ π(K_n)) < ǫ/2^{n+1}. Let L_n = K_1 ∩ · · · ∩ K_n, L = ∩_n K_n. Since each K_n ∈ K_δ(t), so is each L_n, and hence L ∈ K_δ(t). Also L_n ↓ L and L ⊂ A. By Lemma 2.2, π(L_n) ↓ π(L), hence P(π(L_n)) → P(π(L)). Therefore P(π(L_n)) < P(π(L)) + ǫ/2 if n is large enough. We write

P(π(A)) ≤ P(π(A_n)) ≤ P(π(A_n) \ π(L_n)) + P(π(L_n))
                    ≤ P(∪_{i=1}^{n} (π(A_i) \ π(K_i))) + P(π(L_n))
                    ≤ Σ_{i=1}^{n} P(π(A_i) \ π(K_i)) + P(π(L_n))
                    < ǫ + P(π(L))

if n is large enough. Therefore A is t-approximable.

If I_0(t) is the collection of sets of the form [a,b)×C, where a < b ≤ t and C ∈ F_t, and I(t) is the collection of finite unions of sets in I_0(t), then I(t) is an algebra of sets. We note that I(t) generates the σ-field B[0,t] × F_t. A set in I_0(t) of the form [a,b)×C is the union of sets in K_0(t) of the form [a, b−(1/m)]×C, and it follows that every set in I(t) is the increasing union of sets in K(t). Since M is a monotone class containing K(t), then M contains I(t). By the monotone class theorem, M = B[0,t] × F_t. By Proposition 2.3, if A ∈ B[0,t] × F_t, then π(A) ∈ F_t.

Now let E be a progressively measurable set and let A = E ∩ ([0,t)×Ω). We have (D_E < t) = π(A) ∈ F_t. Because t was arbitrary and the filtration is right continuous, D_E is a stopping time.
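To check the identity used here: ω ∈ π(A) if and only if there is some s < t with (s,ω) ∈ E, which holds if and only if D_E(ω) < t; thus

(D_E < t) = π(E ∩ ([0,t)×Ω)) ∈ F_t,

and the criterion (T < t) ∈ F_t for all t from the beginning of this section applies.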

If B is a Borel subset of a topological space S, let

U_B = inf{t ≥ 0 : X_t ∈ B}

and

T_B = inf{t > 0 : X_t ∈ B}.


Theorem 2.4. If X is a progressively measurable process taking values in S and B is a Borel subset of S, then U_B and T_B are stopping times.

Proof. Since B is a Borel subset of S and X is progressively measurable, then 1_B(X_t) is also progressively measurable. U_B is then the debut of the set E = {(s,ω) : 1_B(X_s(ω)) = 1}, and therefore is a stopping time.
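In more detail, for each t the map (s,ω) → X_s(ω) on [0,t]×Ω is B[0,t] × F_t-measurable by progressive measurability, and 1_B is a Borel function on S, so the composition

(s,ω) → 1_B(X_s(ω))

is B[0,t] × F_t-measurable; hence E is progressively measurable and U_B = D_E, so Theorem 2.1 applies.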

If we let Y^δ_t = X_{t+δ} and U^δ_B = inf{t ≥ 0 : Y^δ_t ∈ B}, then by the above, U^δ_B is a stopping time with respect to the filtration {F^δ_t}, where F^δ_t = F_{t+δ}. It follows that δ + U^δ_B is a stopping time with respect to the filtration {F_t}. Since (1/m) + U^{1/m}_B ↓ T_B, then T_B is a stopping time with respect to {F_t} as well.
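The convergence used in the last step can be seen from the identity

δ + U^δ_B = δ + inf{t ≥ 0 : X_{t+δ} ∈ B} = inf{s ≥ δ : X_s ∈ B},

which decreases to inf{s > 0 : X_s ∈ B} = T_B as δ ↓ 0; together with the right continuity of the filtration this gives the stopping time property of T_B.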

We remark that in the theory of Markov processes, the notion of completion of a σ-field is a bit different. In that case, we suppose that F_t contains all sets N such that P^µ(N) = 0 for every starting measure µ. The proof in Proposition 2.3 shows that

(P^µ)∗(π(A) \ π(B)) = 0

for every starting measure µ, so π(A) \ π(B) is a P^µ-null set for every starting measure µ. Therefore π(A) = π(B) ∪ (π(A) \ π(B)) ∈ F_t. With this modification, the rest of the proof of Theorem 2.1 goes through in the Markov process context.

3 The section theorems

Let (Ω, F, P) be a probability space and let {F_t} be a filtration satisfying the usual conditions. The optional σ-field O is the σ-field of subsets of [0,∞)×Ω generated by the set of maps X : [0,∞)×Ω → R where X is bounded, adapted to the filtration {F_t}, and has right continuous paths. The predictable σ-field P is the σ-field of subsets of [0,∞)×Ω generated by the set of maps X : [0,∞)×Ω → R where X is bounded, adapted to the filtration {F_t}, and has left continuous paths.

Given a stopping time T, we define [T,T] = {(t,ω) : t = T(ω) < ∞}. A stopping time T is predictable if there exist stopping times T_1, T_2, . . . with T_1 ≤ T_2 ≤ · · ·, T_n ↑ T, and on the event (T > 0), T_n < T for all n. We say the stopping times T_n predict T. If T is a predictable stopping time and S = T a.s., we also call S a predictable stopping time. The optional section theorem is the following.

Theorem 3.1. If E is an optional set and ǫ > 0, there exists a stopping time T such that [T,T] ⊂ E and P(π(E)) ≤ P(T < ∞) + ǫ.

The statement of the predictable section theorem is very similar.

Theorem 3.2. If E is a predictable set and ǫ > 0, there exists a predictable stopping time T such that [T,T] ⊂ E and P(π(E)) ≤ P(T < ∞) + ǫ.
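For orientation, the simplest examples of predictable stopping times are deterministic times: T ≡ c is predicted by the stopping times

T_n = max(c − 1/n, 0),

which increase to c and satisfy T_n < c for every n on the event (T > 0), that is, when c > 0.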

First we prove the following lemma.

Lemma 3.3. (1) O is generated by the collection of processes 1_C(ω)1_{[a,b)}(t) where C ∈ F_a.


Proof. (1) First of all, 1_C(ω)1_{[a,b)}(t) is a bounded right continuous adapted process, so it is optional.

Let O′ be the σ-field on [0,∞)×Ω generated by the collection of processes 1_C(ω)1_{[a,b)}(t), where C ∈ F_a. Letting b → ∞, O′ includes sets of the form [a,∞)×C with C ∈ F_a.

Let X_t be a right continuous, bounded, and adapted process and let ǫ > 0. Let U_0 = 0 and define U_{i+1} = inf{t > U_i : |X_t − X_{U_i}| > ǫ}. Since (U_1 < t) = ∪_q (|X_q − X_0| > ǫ), where the union is over all rational q less than t, U_1 is a stopping time, and an analogous argument shows that each U_i is also a stopping time. If S and T are stopping times, let 1_{[S,T)} denote the indicator of {(t,ω) ∈ [0,∞)×Ω : S(ω) ≤ t < T(ω)}.

t| ≤ ǫ. Therefore we can approximate X by processes of the form

n → ∞, so is predictable. On the other hand, if X_t is a bounded adapted left continuous process, it can be approximated by

A consequence of this lemma is that P ⊂ O. Since O is generated by the class of right continuous processes and right continuous processes are progressively measurable, we have from Theorem 2.1 that the debut of a predictable or optional set is a stopping time.


The proof of the following proposition is almost identical to the proof of Theorem 2.1. Because the debut of optional sets is now known to be a stopping time, it is not necessary to work with P∗.

To prove Theorem 3.2 we follow along the same lines. Define

P(t) = {A ∩ ([0,t]×Ω) : A ∈ P}

be the collection of finite unions of sets in Ĩ_0(t). Following the proof of Theorem 3.1, we will be done once we show D_B is a predictable stopping time when B ∈ K̃_δ(t).

stopping time. Let R_{nm} be stopping times predicting D_{B_n} and choose m_n large so that

P(R_{nm_n}

T, except for a set of probability zero. Hence T is a predictable stopping time.

If n > m, then [D_{B_n}, D_{B_n}] ⊂ B_n ⊂ B_m. Since S(B_m)(ω) is a closed subset of [0,t], the facts that D_{B_n}(ω) ∈ S(B_m)(ω) for n > m and D_{B_n}(ω) → T(ω) for each ω show that T(ω) ∈ S(B_m)(ω) for each ω. Thus [T,T] ⊂ B_m. This is true for all m, so [T,T] ⊂ B. In particular, T ≥ D_B, so T = D_B. Therefore π(B) = (D_B < ∞) = π([T,T]).


References

[1] C. Dellacherie and P.-A. Meyer. Probabilities and Potential. North-Holland, Amsterdam, 1978. MR0521810
