Electronic Communications in Probability
THE MEASURABILITY OF HITTING TIMES
RICHARD F. BASS1
Department of Mathematics, University of Connecticut, Storrs, CT 06269-3009
email: [email protected]

Submitted January 18, 2010, accepted in final form March 8, 2010

AMS 2000 Subject classification: Primary: 60G07; Secondary: 60G05, 60G40
Keywords: Stopping time, hitting time, progressively measurable, optional, predictable, debut theorem, section theorem
Abstract
Under very general conditions the hitting time of a set by a stochastic process is a stopping time. We give a new simple proof of this fact. The section theorems for optional and predictable sets are easy corollaries of the proof.
1 Introduction
A fundamental theorem in the foundations of stochastic processes is the one that says that, under very general conditions, the first time a stochastic process enters a set is a stopping time. The proof uses capacities, analytic sets, and Choquet's capacitability theorem, and is considered hard. To the best of our knowledge, no more than a handful of books have an exposition that starts with the definition of capacity and proceeds to the hitting time theorem. (One that does is [1].) The purpose of this paper is to give a short and elementary proof of this theorem. The proof is simple enough that it could easily be included in a first year graduate course in probability. In Section 2 we give a proof of the debut theorem, from which the measurability theorem follows. As easy corollaries we obtain the section theorems for optional and predictable sets. This argument is given in Section 3.
2 The debut theorem
Suppose (Ω, F, P) is a probability space. The outer probability P∗ associated with P is given by

P∗(A) = inf{P(B) : A ⊂ B, B ∈ F}.

A set A is a P-null set if P(A) = 0. Suppose {Ft} is a filtration satisfying the usual conditions: ∩ǫ>0 Ft+ǫ = Ft for all t ≥ 0, and each Ft contains every P-null set. Let π : [0,∞) × Ω → Ω be defined by π(t,ω) = ω.
1Research partially supported by NSF grant DMS-0901505.
Recall that a random variable T taking values in [0,∞] is a stopping time if (T ≤ t) ∈ Ft for all t; we allow our stopping times to take the value infinity. Since the filtration satisfies the usual conditions, T will be a stopping time if (T < t) ∈ Ft for all t. If {Ti} is a finite or countable collection of stopping times, then supi Ti and infi Ti are also stopping times.
Given a topological space S, the Borel σ-field is the one generated by the open sets. Let B[0,t] denote the Borel σ-field on [0,t] and B[0,t] × Ft the product σ-field. A process X taking values in a topological space S is progressively measurable if for each t the map (s,ω) → Xs(ω) from [0,t] × Ω to S is measurable with respect to B[0,t] × Ft, that is, the inverse images of Borel subsets of S are elements of B[0,t] × Ft. If the paths of X are right continuous, then X is easily seen to be progressively measurable. The same is true if X has left continuous paths. A subset of [0,∞) × Ω is progressively measurable if its indicator is a progressively measurable process.

If E ⊂ [0,∞) × Ω, let DE(ω) = inf{t ≥ 0 : (t,ω) ∈ E}, the debut of E. We will prove
Theorem 2.1. If E is a progressively measurable set, then DE is a stopping time.
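Before turning to the proof, the definition of the debut can be illustrated numerically. The sketch below is an illustration only, not part of the paper; the function name `debut`, the time grid, and the sample path are invented for the example. For a path observed on a discrete grid, the debut is approximated by the first grid time at which the path lies in the set, with the usual convention that the infimum of the empty set is +∞.

```python
import numpy as np

def debut(times, path, in_set):
    """Return the first grid time t with (t, omega) in the set,
    i.e. the first t at which in_set(X_t) holds, or float('inf')
    if the path never enters the set (infimum of the empty set)."""
    for t, x in zip(times, path):
        if in_set(x):
            return t
    return float("inf")

times = np.linspace(0.0, 5.0, 6)                  # grid 0, 1, 2, 3, 4, 5
path = np.array([0.0, 0.5, 1.2, 0.8, 1.5, 0.3])   # one sample path

print(debut(times, path, lambda x: x >= 1.0))     # 2.0: first time the path is >= 1
print(debut(times, path, lambda x: x >= 9.0))     # inf: the path never enters the set
```

On a discrete grid the debut is computed path by path; Theorem 2.1 is precisely the statement that, for progressively measurable E, this random time is measurable in the strong sense of being a stopping time.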
Fix t. Let K0(t) be the collection of subsets of [0,t] × Ω of the form K × C, where K is a compact subset of [0,t] and C ∈ Ft. Let K(t) be the collection of finite unions of sets in K0(t) and let Kδ(t) be the collection of countable intersections of sets in K(t). We say A ∈ B[0,t] × Ft is t-approximable if given ǫ > 0, there exists B ∈ Kδ(t) with B ⊂ A and

P∗(π(A)) ≤ P∗(π(B)) + ǫ.    (2.1)
Lemma 2.2. If B ∈ Kδ(t), then π(B) ∈ Ft. If Bn ∈ Kδ(t) and Bn ↓ B, then π(B) = ∩n π(Bn).

The hypothesis that the Bn be in Kδ(t) is important. For example, if Bn = [1 − (1/n), 1) × Ω, then π(Bn) = Ω but π(∩n Bn) = ∅. This is why the proof given in [2, Lemma 6.18] is incorrect.
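The role of compactness in this counterexample can be seen concretely. The following sketch is illustrative only (not from the paper; the helper names are invented): no point lies in every half-open interval [1 − 1/n, 1), since the candidate limit point 1 is excluded and any x < 1 drops out once n is large, whereas the compact intervals [1 − 1/n, 1] all contain 1.

```python
def in_half_open(x, n):
    """Is x in the half-open interval [1 - 1/n, 1)?"""
    return 1 - 1 / n <= x < 1

def in_closed(x, n):
    """Is x in the compact interval [1 - 1/n, 1]?"""
    return 1 - 1 / n <= x <= 1

x = 0.999
n_big = 10_000                       # 1 - 1/n_big = 0.9999 > x
print(in_half_open(x, n_big))        # False: any x < 1 drops out for large n
print(all(in_closed(1, n) for n in range(1, 1000)))  # True: 1 is in every [1 - 1/n, 1]
```

This is the finite intersection property: nested nonempty compact sets always have a common point, while the nested nonempty sets [1 − 1/n, 1) do not.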
Proof. If B = K × C, where K is a nonempty compact subset of [0,t] and C ∈ Ft, then π(B) = C ∈ Ft. Therefore π(B) ∈ Ft if B ∈ K0(t). If B = ∪_{i=1}^m Ai with Ai ∈ K0(t), then π(B) = ∪_{i=1}^m π(Ai) ∈ Ft.
For each ω and each set C, let

S(C)(ω) = {s ≤ t : (s,ω) ∈ C}.    (2.2)

If B ∈ K(t), then S(B)(ω) is a finite union of compact sets, hence compact. If B ∈ Kδ(t) and Bn ↓ B with Bn ∈ K(t) for each n, then S(Bn)(ω) ↓ S(B)(ω), so S(B)(ω) is compact.
Now suppose B ∈ Kδ(t) and take Bn ↓ B with Bn ∈ Kδ(t). S(Bn)(ω) is a compact subset of [0,t] for each n and S(Bn)(ω) ↓ S(B)(ω). One possibility is that ∩n S(Bn)(ω) ≠ ∅; in this case, if s ∈ ∩n S(Bn)(ω), then (s,ω) ∈ Bn for each n, and so (s,ω) ∈ B. Therefore ω ∈ π(Bn) for each n and ω ∈ π(B). The other possibility is that ∩n S(Bn)(ω) = ∅. Since the sequence S(Bn)(ω) is a decreasing sequence of compact sets, S(Bn)(ω) = ∅ for some n, for otherwise {S(Bn)(ω)^c} would be an open cover of [0,t] with no finite subcover. Therefore ω ∉ π(Bn) for this n, and ω ∉ π(B). We conclude that π(B) = ∩n π(Bn).
Finally, suppose B ∈ Kδ(t) and Bn ↓ B with Bn ∈ K(t). Then π(B) = ∩n π(Bn) ∈ Ft.
Proposition 2.3. Suppose A is t-approximable. Then π(A) ∈ Ft. Moreover, given ǫ > 0 there exists B ∈ Kδ(t) with B ⊂ A such that P(π(A) \ π(B)) < ǫ.
Proof. Choose An ∈ Kδ(t) with An ⊂ A and P(π(An)) → P∗(π(A)). Let Bn = A1 ∪ ··· ∪ An and let B = ∪n Bn. Then Bn ∈ Kδ(t), Bn ↑ B, and P(π(Bn)) ≥ P(π(An)) → P∗(π(A)). It follows that π(Bn) ↑ π(B), and so π(B) ∈ Ft and P(π(B)) = limn P(π(Bn)) ≥ P∗(π(A)).

For each n, there exists Cn ∈ F such that π(A) ⊂ Cn and P(Cn) ≤ P∗(π(A)) + 1/n. Setting C = ∩n Cn, we have π(A) ⊂ C and P∗(π(A)) = P(C). Therefore π(B) ⊂ π(A) ⊂ C and P(π(B)) = P∗(π(A)) = P(C). This implies that π(A) \ π(B) is a P-null set, and by the completeness assumption, π(A) = (π(A) \ π(B)) ∪ π(B) ∈ Ft. Finally,

limn P(π(A) \ π(Bn)) = P(π(A) \ π(B)) = 0,

so taking B = Bn for n large enough proves the last assertion.
We now prove Theorem 2.1.

Proof of Theorem 2.1. Fix t. Let

M = {A ∈ B[0,t] × Ft : A is t-approximable}.
We show M is a monotone class. Suppose An ∈ M, An ↑ A. Then π(An) ↑ π(A). By Proposition 2.3, π(An) ∈ Ft for all n, and therefore π(A) ∈ Ft and P(π(A)) = limn P(π(An)). Given ǫ > 0, choose n large so that P(π(A)) < P(π(An)) + ǫ/2. Then choose Kn ∈ Kδ(t) such that Kn ⊂ An and P(π(An)) < P(π(Kn)) + ǫ/2. This shows A is t-approximable.
Now suppose An ∈ M and An ↓ A. Given ǫ > 0, choose Kn ∈ Kδ(t) such that Kn ⊂ An and P(π(An) \ π(Kn)) < ǫ/2^{n+1}. Let Ln = K1 ∩ ··· ∩ Kn and L = ∩n Kn. Since each Kn ∈ Kδ(t), so is each Ln, and hence L ∈ Kδ(t). Also Ln ↓ L and L ⊂ A. By Lemma 2.2, π(Ln) ↓ π(L), hence P(π(Ln)) → P(π(L)). Therefore P(π(Ln)) < P(π(L)) + ǫ/2 if n is large enough. We write

P(π(A)) ≤ P(π(An)) ≤ P(π(An) \ π(Ln)) + P(π(Ln))
        ≤ P(∪_{i=1}^n (π(Ai) \ π(Ki))) + P(π(Ln))
        ≤ ∑_{i=1}^n P(π(Ai) \ π(Ki)) + P(π(Ln))
        < ǫ + P(π(L))

if n is large enough. Therefore A is t-approximable.
If I0(t) is the collection of sets of the form [a,b) × C, where a < b ≤ t and C ∈ Ft, and I(t) is the collection of finite unions of sets in I0(t), then I(t) is an algebra of sets. We note that I(t) generates the σ-field B[0,t] × Ft. A set in I0(t) of the form [a,b) × C is the increasing union of the sets [a, b − (1/m)] × C in K0(t), and it follows that every set in I(t) is the increasing union of sets in K(t). Since M is a monotone class containing K(t), M contains I(t). By the monotone class theorem, M = B[0,t] × Ft. By Proposition 2.3, if A ∈ B[0,t] × Ft, then π(A) ∈ Ft.
Now let E be a progressively measurable set and let A = E ∩ ([0,t) × Ω). We have (DE < t) = π(A) ∈ Ft. Because the filtration satisfies the usual conditions and t was arbitrary, DE is a stopping time.
If B is a Borel subset of a topological space S, let

UB = inf{t ≥ 0 : Xt ∈ B}

and

TB = inf{t > 0 : Xt ∈ B}.

Theorem 2.4. If X is a progressively measurable process taking values in S and B is a Borel subset of S, then UB and TB are stopping times.
Proof. Since B is a Borel subset of S and X is progressively measurable, 1B(Xt) is also progressively measurable. UB is then the debut of the set E = {(s,ω) : 1B(Xs(ω)) = 1}, and therefore is a stopping time.
If we let Y^δ_t = X_{t+δ} and U^δ_B = inf{t ≥ 0 : Y^δ_t ∈ B}, then by the above, U^δ_B is a stopping time with respect to the filtration {F^δ_t}, where F^δ_t = F_{t+δ}. It follows that δ + U^δ_B is a stopping time with respect to the filtration {Ft}. Since (1/m) + U^{1/m}_B ↓ TB, TB is a stopping time with respect to {Ft} as well.
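The distinction between UB and TB is easy to see on a discrete time grid. The sketch below is an illustration only, not part of the paper; the helper names `U_B` and `T_B` and the sample path are invented. When the process starts inside B, the two hitting times differ.

```python
def U_B(times, path, B):
    """inf{t >= 0 : X_t in B} along one discretized path."""
    hits = [t for t, x in zip(times, path) if x in B]
    return min(hits, default=float("inf"))   # inf of the empty set is +infinity

def T_B(times, path, B):
    """inf{t > 0 : X_t in B} along one discretized path."""
    hits = [t for t, x in zip(times, path) if t > 0 and x in B]
    return min(hits, default=float("inf"))

times = [0, 1, 2, 3, 4]
path  = [0, 5, 0, 7, 0]   # starts in B and returns to B at time 2
B = {0}

print(U_B(times, path, B))  # 0: the path starts in B
print(T_B(times, path, B))  # 2: first strictly positive hitting time
```

The proof above handles exactly this gap between the two times: TB is recovered as the decreasing limit of the shifted hitting times (1/m) + U^{1/m}_B.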
We remark that in the theory of Markov processes, the notion of completion of a σ-field is a bit different. In that case, we suppose that Ft contains all sets N such that Pµ(N) = 0 for every starting measure µ. The proof of Proposition 2.3 shows that

(Pµ)∗(π(A) \ π(B)) = 0

for every starting measure µ, so π(A) \ π(B) is a Pµ-null set for every starting measure µ. Therefore π(A) = π(B) ∪ (π(A) \ π(B)) ∈ Ft. With this modification, the rest of the proof of Theorem 2.1 goes through in the Markov process context.
3 The section theorems
Let (Ω, F, P) be a probability space and let {Ft} be a filtration satisfying the usual conditions. The optional σ-field O is the σ-field of subsets of [0,∞) × Ω generated by the set of maps X : [0,∞) × Ω → R where X is bounded, adapted to the filtration {Ft}, and has right continuous paths. The predictable σ-field P is the σ-field of subsets of [0,∞) × Ω generated by the set of maps X : [0,∞) × Ω → R where X is bounded, adapted to the filtration {Ft}, and has left continuous paths.
Given a stopping time T, we define [T,T] = {(t,ω) : t = T(ω) < ∞}. A stopping time T is predictable if there exist stopping times T1, T2, ... with T1 ≤ T2 ≤ ···, Tn ↑ T, and on the event (T > 0), Tn < T for all n. We say the stopping times Tn predict T. If T is a predictable stopping time and S = T a.s., we also call S a predictable stopping time. The optional section theorem is the following.
Theorem 3.1. If E is an optional set and ǫ > 0, there exists a stopping time T such that [T,T] ⊂ E and P(π(E)) ≤ P(T < ∞) + ǫ.
The statement of the predictable section theorem is very similar.
Theorem 3.2. If E is a predictable set and ǫ > 0, there exists a predictable stopping time T such that [T,T] ⊂ E and P(π(E)) ≤ P(T < ∞) + ǫ.
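As a simple illustration of the definition (an added example, not from the paper), any deterministic time t0 > 0 is a predictable stopping time, and the predicting sequence can be written down explicitly:

```latex
% T(\omega) = t_0 for all \omega, with t_0 > 0 fixed.  The deterministic
% times T_n are stopping times, they increase to T, and T_n < T on
% (T > 0) = \Omega, so the T_n predict T:
T_n = t_0\Bigl(1 - \frac{1}{n}\Bigr), \qquad T_n \uparrow T,
\qquad T_n < T \ \text{on } (T > 0).
```

By contrast, a jump time that arrives "by surprise," such as the first jump of a Poisson process, admits no such announcing sequence and is not predictable.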
First we prove the following lemma.
Lemma 3.3. (1) O is generated by the collection of processes 1C(ω)1[a,b)(t), where C ∈ Fa. (2) P is generated by the collection of processes 1C(ω)1(a,b](t), where C ∈ Fa, together with the processes 1C(ω)1{0}(t), where C ∈ F0.
Proof. (1) First of all, 1C(ω)1[a,b)(t) is a bounded right continuous adapted process, so it is optional.

Let O′ be the σ-field on [0,∞) × Ω generated by the collection of processes 1C(ω)1[a,b)(t), where C ∈ Fa. Letting b → ∞, we see that O′ includes sets of the form [a,∞) × C with C ∈ Fa.
Let Xt be a right continuous, bounded, and adapted process and let ǫ > 0. Let U0 = 0 and define Ui+1 = inf{t > Ui : |Xt − XUi| > ǫ}. Since (U1 < t) = ∪(|Xq − X0| > ǫ), where the union is over all rational q less than t, U1 is a stopping time, and an analogous argument shows that each Ui is also a stopping time. If S and T are stopping times, let [S,T) = {(t,ω) ∈ [0,∞) × Ω : S(ω) ≤ t < T(ω)}. If Ui ≤ t < Ui+1, then |Xt − XUi| ≤ ǫ. Therefore we can approximate X by processes of the form ∑_{i=0}^∞ XUi(ω)1[Ui,Ui+1)(t), and it follows that X is measurable with respect to O′. Hence O = O′.

(2) A process of the form 1C(ω)1(a,b](t) with C ∈ Fa is bounded, left continuous, and adapted, and 1C(ω)1{0}(t) with C ∈ F0 is the limit of the continuous adapted processes 1C(ω)(1 − nt)^+ as n → ∞, so is predictable. On the other hand, if Xt is a bounded adapted left continuous process, it can be approximated by processes of the form X0(ω)1{0}(t) + ∑_{i=0}^∞ X_{i/n}(ω)1_{(i/n,(i+1)/n]}(t) as n → ∞; by left continuity the approximation converges pointwise, which proves (2).
A consequence of this lemma is thatP ⊂ O. SinceO is generated by the class of right continuous processes and right continuous processes are progressively measurable, we have from Theorem 2.1 that the debut of a predictable or optional set is a stopping time.
The proof of the following proposition is almost identical to the proof of Theorem 2.1. Because the debut of optional sets is now known to be a stopping time, it is not necessary to work with P∗.
To prove Theorem 3.2 we follow along the same lines. Define

P(t) = {A ∩ ([0,t] × Ω) : A ∈ P}

and let Ĩ(t) be the collection of finite unions of sets in Ĩ0(t). Following the proof of Theorem 3.1, we will be done once we show DB is a predictable stopping time when B ∈ K̃δ(t).
stopping time. LetRnmbe stopping times predictingDBnand choosemnlarge so that
P(Rnm
T, except for a set of probability zero. HenceT is a predictable stopping time.
If n > m, then [DBn, DBn] ⊂ Bn ⊂ Bm. Since S(Bm)(ω) is a closed subset of [0,t], the facts that DBn(ω) ∈ S(Bm)(ω) for n > m and DBn(ω) → T(ω) for each ω show that T(ω) ∈ S(Bm)(ω) for each ω. Thus [T,T] ⊂ Bm. This is true for all m, so [T,T] ⊂ B. In particular, T ≥ DB, so T = DB. Therefore π(B) = (DB < ∞) = π([T,T]).
References
[1] C. Dellacherie and P.-A. Meyer. Probabilities and Potential. North-Holland, Amsterdam, 1978. MR0521810