
Filtrations


Continuing with the heuristics, suppose that we are interested in a random experiment taking place over an infinite expanse of time. Let T = R+ or T = N be the time set. For each time t, let Ft be the information gathered during [0, t] by an observer of the experiment. For s < t, we must have Fs ⊂ Ft. The family F = {Ft : t ∈ T}, then, depicts the flow of information as the experiment progresses over time. The following definition formalizes this concept.

4.8 Definition. Let T be a subset of R. For each t in T, let Ft be a sub-σ-algebra of H. The family F = {Ft : t ∈ T} is called a filtration provided that Fs ⊂ Ft for s < t.

In other words, a filtration is an increasing family of sub-σ-algebras of H. The simplest examples are the filtrations generated by stochastic processes:

If X = {Xt : t ∈ T} is a stochastic process, then putting Ft = σ{Xs : s ≤ t, s ∈ T} yields a filtration F = {Ft : t ∈ T}. The reader is invited to ponder the meaning of the next proposition for such a filtration. Of course, the aim is to approximate eternal variables by random variables that become known in finite time.
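For instance, as a standard illustration: let X1, X2, . . . be the successive outcomes of coin tosses, each taking values in {0, 1}. Then

Fn = σ{X1, . . . , Xn},    n ∈ N,

is the filtration generated by the process; an event such as {X1 + · · · + Xn = k} belongs to Fn, whereas an event involving the whole sequence, such as {limn (X1 + · · · + Xn)/n = 1/2}, belongs to ⋁n Fn but, in general, to no single Fn.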

4.9 Proposition. Let F = {Fn : n ∈ N} be a filtration and put F = ⋁n∈N Fn. For each bounded random variable V in F there are bounded variables Vn in Fn, n ∈ N, such that

limn E|Vn − V| = 0.

Remark. Note that E|Vn − V| = ‖Vn − V‖1 in the notation of section 3; thus, the approximation here is in the sense of L1-space. Also, we may add to the conclusion that EVn → EV; this follows from the observation that |EVn − EV| ≤ E|Vn − V|.

Proof. Let C = ⋃n Fn. By definition, F = σC. Obviously C is a p-system.

To complete the proof via the monotone class theorem, we start by letting Mb be the collection of all bounded variables in F having the approximation property described. It is easy to see that Mb includes the constants, is a vector space over R, and includes the indicators of events in C. Thus, Mb will include all bounded V in F once we check the remaining monotonicity condition.

Let (Uk) ⊂ Mb be positive and increasing to a bounded variable V in F. Then, for each k ≥ 1 there are Uk,n in Fn, n ∈ N, such that E|Uk,n − Uk| → 0 as n → ∞. Put n0 = 0, and for each k ≥ 1 choose nk > nk−1 such that Ûk = Uk,nk satisfies

E|Ûk − Uk| < 1/k.

Moreover, since (Uk) is bounded and converges to V, the bounded convergence theorem implies that E|Uk − V| → 0. Hence,

4.10    E|Ûk − V| ≤ E|Ûk − Uk| + E|Uk − V| → 0

as k → ∞. With n0 = 0, choose V0 = 0 and put Vn = Ûk for all integers n in (nk, nk+1]; then, Vn ∈ Fnk ⊂ Fn, and E|Vn − V| → 0 as n → ∞ in view of 4.10. This is what we need to show that V ∈ Mb.

In the preceding proposition, the Vn are shown to exist but are unspecified. A very specific version will appear later employing totally new tools; see the martingale convergence theorems of Chapter V and, in particular, Corollary V.3.30 there.

Exercises and complements

4.11 p-systems for σX. Let T be an arbitrary index set. Let X = (Xt)t∈T, where Xt takes values in (Et, Et) for each t in T. For each t, let Ct be a p-system that generates Et. Let G0 be the collection of all G ⊂ Ω having the form

G = ⋂t∈S {Xt ∈ At}

for some finite S ⊂ T and At in Ct for every t in S. Show that G0 is a p-system that generates G = σX.
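For instance, that G0 is closed under intersections can be checked directly: if G = ⋂t∈S {Xt ∈ At} and G′ = ⋂t∈S′ {Xt ∈ A′t} are two such sets, then

G ∩ G′ = ⋂t∈S∪S′ {Xt ∈ Bt},

where Bt = At ∩ A′t for t in S ∩ S′, Bt = At for t in S \ S′, and Bt = A′t for t in S′ \ S; each Bt belongs to Ct since Ct is a p-system.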

4.12 Monotone class theorem. This is a generalization of the monotone class theorem I.2.19. We keep the setting and notations of the preceding exercise.

Let M be a monotone class of mappings from Ω into R̄. Suppose that M includes every V : Ω → [0, 1] having the form

V = ∏t∈S 1At ∘ Xt,    S finite, At ∈ Ct for every t in S.

Then, every positive V in σX belongs to M. Prove.

4.13 Special case. In the setting of the exercises above, suppose Et = R and Et = BR for all t. Let M be a monotone class of mappings from Ω into R̄. Suppose that M includes every V of the form

V = (f1 ∘ Xt1) · · · (fn ∘ Xtn)

with n ≥ 1 and t1, . . . , tn in T and f1, . . . , fn bounded continuous functions from R into R. Then, M contains all positive V in σX. Prove. Hint: Start by showing that, if A is an open interval of R, then 1A is the limit of an increasing sequence of bounded continuous functions.
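For the hint, one possible choice is the following: if A = (a, b), then the functions

fn(x) = min{1, n d(x, R \ A)},    n ≥ 1,

where d(x, B) = inf{|x − y| : y ∈ B}, are bounded and continuous, and they increase to 1A as n → ∞.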

4.14 Determinability. If X and Y are random variables taking values in (E, E) and (D, D), then we say that X determines Y if Y = f ∘ X for some f : E → D measurable with respect to E and D. Then, σX ⊃ σY obviously.

Heuristically, X determines Y if knowing X(ω) is sufficient for knowing Y(ω), this being true for every possibility ω. To illustrate the notion in a simple setting, let T be a positive random variable and define a stochastic process X = (Xt)t∈R+ by setting, for each ω,

Xt(ω) =  0  if t < T(ω),
         1  if t ≥ T(ω).

Show that X and T determine each other. If T represents the time of failure for a device, then X is the process that indicates whether the device has failed or not. That X and T determine each other is intuitively obvious, but the measurability issues cannot be ignored altogether.
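For instance, the recovery of T from X rests on the identities

{T ≤ t} = {Xt = 1},    t ∈ R+,    and    T = inf{t ∈ R+ : Xt = 1},

while, conversely, Xt = 1[0,t] ∘ T for each t; the exercise consists in turning these identities into the measurability statements required by the definition above.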

4.15 Warning. A slight change in the preceding exercise shows that one must guard against raw intuition. Let T have a distribution that is absolutely continuous with respect to the Lebesgue measure on R+; in fact, all we need is that P{T = t} = 0 for every t in R+. Define

Xt(ω) =  1  if t = T(ω),
         0  otherwise.

Show that, for each t in R+, the random variable Xt is determined by T. But, contrary to raw intuition, T is not determined by X = (Xt)t∈R+. Show this by following the steps below:

a) For each t, we have Xt = 0 almost surely. Therefore, for every sequence (tn) in R+, Xt1 = Xt2 = · · · = 0 almost surely.

b) If V ∈ σX, then V = c almost surely for some constant c. It follows that T is not in σX.

4.16 Arrival processes. Let T = (T1, T2, . . .) be an increasing sequence of R+-valued variables. Define a stochastic process X = (Xt)t∈R+ with state space N by

Xt = ∑∞n=1 1(0,t] ∘ Tn,    t ∈ R+.

Show that X and T determine each other. If Tn represents the n-th arrival time at a store, then Xt is the number of customers who arrived during (0, t]. So, X and T are the same phenomenon viewed from different angles.
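For instance, heuristically the n-th arrival time can be recovered from X through

Tn = inf{t ∈ R+ : Xt ≥ n},    n = 1, 2, . . . ,

while, conversely, each Xt is by its very definition a function of the sequence T; as in Exercise 4.14, the point is to justify the measurability behind these identities.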

5 Independence

This section is about independence, a truly probabilistic concept. For random variables, the concept reduces to the earlier definition: they are independent if and only if their joint distribution is the product of their marginal distributions.
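For instance, for two random variables X and Y taking values in (E, E) and (D, D), this means that

P{X ∈ A, Y ∈ B} = P{X ∈ A} P{Y ∈ B}    for all A ∈ E and B ∈ D.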

Throughout, (Ω, H, P) is a probability space. As usual, if G is a sub-σ-algebra of H, we regard it both as a collection of events and as the collection of all numerical random variables that are measurable with respect to it.

Recall that σX is the σ-algebra on Ω generated by X, and X here can be a random variable or a collection of random variables. Finally, we write FI for ⋁i∈I Fi as in I.1.8 and refer to it as the σ-algebra generated by the collection of σ-algebras Fi, i ∈ I.
