Lecture-02: Probability Review

1 Probability Review

Definition 1.1 (σ-algebra). A collection A of subsets of a set Ω is called a σ-algebra if (i) it contains the empty set, (ii) it is closed under complements, and (iii) it is closed under countable unions.

Definition 1.2. A probability space (Ω,F,P) consists of

(i) a set of all possible outcomes, called the sample space and denoted by Ω,
(ii) a σ-algebra over the sample space, called the event space and denoted by F, and
(iii) a non-negative set function called the probability, denoted by P: F → [0, 1], such that P(Ω) = 1 and P is additive over countable collections of disjoint events.

An element of the sample space is called an outcome, and an element of the event space is called an event.

Definition 1.3. A collection of events E ⊆ F is called a sub-event space if it is a σ-algebra over Ω.

Definition 1.4. For a family of events A ⊆ F, the sub-event space generated by the family A is the smallest sub-event space containing A, and is denoted by σ(A).

Remark 1. The sub-event space σ(A) contains all the elements of A, together with the complements and countable unions of the sets so generated.

Example 1.5 (Discrete σ-algebra). For a finite sample space Ω, the event space F = {A : A ⊆ Ω} consists of all subsets of the sample space Ω, and is denoted by 2^Ω or P(Ω).
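
For a finite sample space, the generation procedure of Remark 1 can be carried out explicitly. The following is a minimal Python sketch (assuming the small sample space and generating family chosen below, which are not from the notes): it closes a family of subsets under complements and pairwise unions until it stabilizes, which on a finite Ω yields σ(A).

```python
from itertools import combinations

def generate_sigma_algebra(omega, family):
    """Smallest sigma-algebra over the finite set `omega` containing `family`.

    On a finite sample space countable unions reduce to finite unions, so
    closing under complements and pairwise unions until a fixed point is
    reached gives sigma(A).
    """
    omega = frozenset(omega)
    sigma = {frozenset(), omega} | {frozenset(a) for a in family}
    while True:
        closure = set(sigma)
        closure |= {omega - a for a in sigma}                  # complements
        closure |= {a | b for a, b in combinations(sigma, 2)}  # pairwise unions
        if closure == sigma:
            return sigma
        sigma = closure

# sub-event space generated by the single event {1, 2} on Omega = {1, 2, 3, 4}
events = generate_sigma_algebra({1, 2, 3, 4}, [{1, 2}])
for e in sorted(events, key=lambda s: (len(s), sorted(s))):
    print(set(e))
```

For this family the result is {∅, {1, 2}, {3, 4}, Ω}, a strict sub-event space of the power set 2^Ω of Example 1.5.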

Example 1.6 (Borel σ-algebra). If the sample space Ω = R, then the Borel σ-algebra is generated by the half-open intervals, via complements and countable unions. That is, B(R) ≜ σ({(−∞,x] : x ∈ R}). We make the following observations.

1. From closure under complements, the open interval (x, ∞) = (−∞,x]^c belongs to B(R) for each x ∈ R.

2. From closure under countable unions, the open interval (−∞, x) = ∪_{n∈N} (−∞, x − 1/n] belongs to B(R) for each x ∈ R.

3. From closure under complements, the half-closed interval [x, ∞) = (−∞, x)^c belongs to B(R) for each x ∈ R.

4. From closure under finite intersections, the closed interval [x, y] = [x, ∞) ∩ (−∞, y] belongs to B(R) for each x, y ∈ R.

5. From closure under countable intersections, the singleton {x} = ∩_{n∈N} [x − 1/n, x + 1/n] belongs to B(R) for each x ∈ R.

1.1 Limits of sets and continuity of probability

There is a natural order of inclusion on sets through which we can define monotonicity of the probability set function P. To define continuity of this set function, we define limits of sets.


Definition 1.7. For a sequence of sets (A_n : n ∈ N), we define the limit superior and limit inferior of this set sequence respectively as

lim sup_n A_n = ∩_{n∈N} ∪_{k⩾n} A_k,    lim inf_n A_n = ∪_{n∈N} ∩_{k⩾n} A_k.

It is easy to check that lim inf_n A_n ⊆ lim sup_n A_n. We say that the limit of the set sequence exists if lim sup_n A_n ⊆ lim inf_n A_n, and in this case the limit of the set sequence is lim sup_n A_n = lim inf_n A_n.
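
As a toy illustration of Definition 1.7 (an added sketch with an arbitrarily chosen alternating sequence, not from the notes), the tail unions and intersections of an eventually periodic set sequence stabilize after finitely many terms, so a finite truncation computes lim sup and lim inf exactly.

```python
def limsup_liminf(sets):
    """lim sup = ∩_n ∪_{k≥n} A_k and lim inf = ∪_n ∩_{k≥n} A_k, computed
    over a finite truncation of the set sequence."""
    n = len(sets)
    tail_unions = [frozenset().union(*sets[i:]) for i in range(n)]
    tail_inters = [frozenset(sets[i]).intersection(*sets[i:]) for i in range(n)]
    lim_sup = frozenset(tail_unions[0]).intersection(*tail_unions)
    lim_inf = frozenset().union(*tail_inters)
    return lim_sup, lim_inf

# A_n = {0} for even n and {1} for odd n: the limit does not exist, since
# lim inf A_n = ∅ is a strict subset of lim sup A_n = {0, 1}.
A = [frozenset({0}) if k % 2 == 0 else frozenset({1}) for k in range(20)]
print(limsup_liminf(A))   # (frozenset({0, 1}), frozenset())
```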

Theorem 1.8. The probability set function is monotone and continuous.

Proof. Consider two events A ⊆ B, both elements of F. From the additivity of probability over the disjoint events A and B\A, we have

P(B) = P(A ∪ (B\A)) = P(A) + P(B\A) ⩾ P(A).

Monotonicity follows from the non-negativity of the probability set function, that is, since P(B\A) ⩾ 0. For continuity from below, we take an increasing sequence of sets (A_n : n ∈ N), such that A_n ⊆ A_{n+1} for all n. We observe that A ≜ lim_n A_n = ∪_{n∈N} A_n and A_n ↑ A. We can define disjoint sets (E_n : n ∈ N), where

E_1 = A_1,    E_n = A_n \ A_{n−1}, n ⩾ 2.

The disjoint sets E_n satisfy ∪_{i=1}^n E_i = A_n for all n ∈ N and ∪_n E_n = ∪_n A_n. From the above property and the additivity of the probability set function over disjoint sets, it follows that

P(A) = P(∪_n E_n) = ∑_{n∈N} P(E_n) = lim_n ∑_{i=1}^n P(E_i) = lim_n P(∪_{i=1}^n E_i) = lim_n P(A_n).

For continuity from above, we take a decreasing sequence of sets (A_n : n ∈ N), such that A_{n+1} ⊆ A_n for all n. We can form an increasing sequence of sets (B_n : n ∈ N) where B_n = A_n^c. Then, the continuity from above follows from the continuity from below. Continuity of probability for a general sequence of converging sets follows from the definition of lim sup and lim inf of a sequence of sets and the continuity of the probability function from above and below.
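
A quick numerical illustration of continuity from below (an added sketch, assuming the uniform probability on [0, 1) and the increasing events A_n = [0, 1 − 1/n), which are not from the notes): the Monte Carlo estimates of P(A_n) should increase towards P(∪_n A_n) = P([0, 1)) = 1.

```python
import random

random.seed(0)
samples = [random.random() for _ in range(200_000)]   # uniform draws from [0, 1)

def prob_estimate(n):
    """Monte Carlo estimate of P(A_n) for A_n = [0, 1 - 1/n)."""
    return sum(x < 1 - 1 / n for x in samples) / len(samples)

for n in (1, 2, 5, 10, 100, 1000):
    print(f"n = {n:5d}   P(A_n) ≈ {prob_estimate(n):.4f}   (exact {1 - 1/n:.4f})")
# The estimates increase towards P(∪_n A_n) = P([0, 1)) = 1.
```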

1.2 Independence

Definition 1.9. For a probability space (Ω,F,P), two events A, B ∈ F are independent events if P(A∩B) = P(A)P(B).

Definition 1.10. Two sub-event spaces G and H are called independent if every pair of events (G, H) ∈ G × H is independent. That is,

P(G∩H) = P(G)P(H), for all G ∈ G, H ∈ H.
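
A minimal sketch of Definition 1.9 on an assumed sample space (two fair coin tosses with the uniform probability, not from the notes): the events "first toss is heads" and "second toss is heads" factorize, while "first toss is heads" and "at least one head" do not.

```python
from fractions import Fraction
from itertools import product

# Uniform probability on the sample space of two fair coin tosses.
omega = list(product("HT", repeat=2))

def P(event):
    return Fraction(sum(1 for w in omega if w in event), len(omega))

A = {w for w in omega if w[0] == "H"}   # first toss is heads
B = {w for w in omega if w[1] == "H"}   # second toss is heads
D = {w for w in omega if "H" in w}      # at least one head

print(P(A & B) == P(A) * P(B))   # True:  A and B are independent
print(P(A & D) == P(A) * P(D))   # False: A and D are not independent
```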

1.3 Conditional Probability

Definition 1.11. Let (Ω,F,P) be a probability space. For events A, B ∈ F such that P(B) > 0, the conditional probability of event A given event B is defined as

P(A|B) = P(A∩B)/P(B).
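
A similar added sketch for Definition 1.11, assuming two fair dice with the uniform probability on the 36 outcomes (an illustrative choice, not from the notes): conditioning the event "the sum is 8" on the event "the first die is even" changes its probability from 5/36 to 1/6.

```python
from fractions import Fraction
from itertools import product

# Two fair dice with the uniform probability on the 36 outcomes.
omega = list(product(range(1, 7), repeat=2))

def P(event):
    return Fraction(sum(1 for w in omega if w in event), len(omega))

def conditional(a, b):
    """P(A | B) = P(A ∩ B) / P(B), defined only when P(B) > 0."""
    assert P(b) > 0
    return P(a & b) / P(b)

A = {w for w in omega if sum(w) == 8}    # the sum of the dice is 8
B = {w for w in omega if w[0] % 2 == 0}  # the first die is even

print(P(A))               # 5/36
print(conditional(A, B))  # 1/6: of the 18 outcomes in B, exactly 3 lie in A
```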

2 Random variables

Definition 2.1. A real valued random variable X on a probability space (Ω,F,P) is a function X: Ω → R such that for every x ∈ R, we have

X^{−1}(−∞,x] ≜ {ω ∈ Ω : X(ω) ⩽ x} ∈ F.

Recall that the collection {(−∞,x] : x ∈ R} generates the Borel σ-algebra B(R). Therefore, it follows that X^{−1}(B(R)) ⊆ F, since the set inverse map X^{−1} preserves complements, unions, and intersections.


Definition 2.2. For a random variable X defined on the probability space (Ω,F,P), we define σ(X) as the smallest σ-algebra formed by the inverse mapping of the generating Borel sets, i.e.

σ(X) ≜ σ({X^{−1}(−∞,x] : x ∈ R}).

Remark 2. We note that σ(X) is a sub-event space of F, and hence probability is defined for each element of σ(X).

Definition 2.3. For a random variable X defined on a probability space (Ω,F,P), the corresponding distribution function F: R → [0, 1] is defined as

F(x) ≜ (P ◦ X^{−1})(−∞,x], for all x ∈ R.

Theorem 2.4. The distribution function F of a random variable X: Ω → R is non-negative, monotone increasing, continuous from the right, and has countably many points of discontinuity. Further, if P ◦ X^{−1}(R) = 1, then

lim_{x→−∞} F(x) = 0,    lim_{x→∞} F(x) = 1.

Proof. Non-negativity and monotonicity of the distribution function follow from the non-negativity and monotonicity of the probability set function, and the fact that for x_1 < x_2,

X^{−1}(−∞,x_1] ⊆ X^{−1}(−∞,x_2].

Let x_n ↓ x be a decreasing sequence of real numbers. We take a decreasing sequence of sets A ∈ F^N, where A_n ≜ X^{−1}(−∞,x_n] ∈ F for n ∈ N. Then A_n ↓ A = X^{−1}(−∞,x], and the right continuity of the distribution function follows from the continuity from above of the probability set function. Countably many discontinuities follow from the fact that F is monotone and bounded, since lim_{x→∞} F_X(x) ⩽ 1. By taking sequences a_n ↓ −∞ and b_n ↑ ∞, we define the sequences of monotone sets A_n ≜ X^{−1}(−∞,a_n] ↓ ∅ and B_n ≜ X^{−1}(−∞,b_n] ↑ Ω. The result follows from the continuity of the probability set function.

Example 2.5. One of the simplest families of random variables is the indicator functions 1: F × Ω → {0, 1}. For each event A ∈ F, we can define an indicator function as

1_A(ω) = 1 if ω ∈ A, and 1_A(ω) = 0 if ω ∉ A.

We make the following observations.

1. 1_A is a random variable for each A ∈ F. This follows from the fact that

1_A^{−1}(−∞,x] = ∅ for x < 0, A^c for x ∈ [0, 1), and Ω for x ⩾ 1.

2. The distribution function F for the random variable 1_A is given by

F(x) = 0 for x < 0, P(A^c) for x ∈ [0, 1), and 1 for x ⩾ 1.
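
To connect Example 2.5 with Definition 2.3, the following added sketch (with an arbitrarily chosen event A of probability 0.3 on the uniform space [0, 1), an assumption of this example) simulates the indicator random variable 1_A and evaluates its empirical distribution function, which should match the F above: 0 for x < 0, P(A^c) = 0.7 on [0, 1), and 1 for x ⩾ 1.

```python
import random

random.seed(1)

# Omega = [0, 1) with the uniform probability; A = [0, 0.3), so P(A) = 0.3.
p_A = 0.3
indicator_samples = [1.0 if random.random() < p_A else 0.0 for _ in range(100_000)]

def empirical_F(x):
    """Empirical distribution function of the indicator random variable 1_A."""
    return sum(s <= x for s in indicator_samples) / len(indicator_samples)

for x in (-0.5, 0.0, 0.5, 1.0, 2.0):
    print(f"F({x:4}) ≈ {empirical_F(x):.3f}")
# Expected: 0 for x < 0, P(A^c) = 0.7 on [0, 1), and 1 for x >= 1.
```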

Definition 2.6. A random variable X is independent of an event subspace E if σ(X) and E are independent event subspaces.

Remark 3. Since σ(X) is generated by the collection (X^{−1}(−∞,x] : x ∈ R), it follows that X is independent of E if and only if for all x ∈ R and events E ∈ E,

E[1_{X⩽x} 1_E] = P({X ⩽ x} ∩ E) = P({X ⩽ x}) P(E) = E[1_{X⩽x}] E[1_E].


2.1 Expectation

Let g: R → R be a Borel measurable function, i.e. g^{−1}(−∞,x] ∈ B(R) for all x ∈ R. Then, the expectation of g(X) for a random variable X with distribution function F is defined as

Eg(X) = ∫_{x∈R} g(x) dF(x).

Remark 4. Recall that probabilities are defined only for events. For a random variable X, the probabilities are defined for the generating events X^{−1}(−∞,x] ∈ F by F(x) = P ◦ X^{−1}(−∞,x].

Remark 5. The expectation is defined only for random variables. For an event A, the probability P(A) equals the expectation of the indicator random variable 1_A, i.e. P(A) = E[1_A].
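
A small numerical sketch of the expectation formula and of Remark 5, assuming X uniform on [0, 1) and g(x) = x^2 (choices made for this illustration, not from the notes): the Monte Carlo average of g(X) approximates ∫ g(x) dF(x) = 1/3, and the average of an indicator recovers the probability of the corresponding event.

```python
import random

random.seed(2)
N = 200_000

def expectation(g, sample, n=N):
    """Monte Carlo estimate of Eg(X) = ∫ g(x) dF(x), using draws of X."""
    return sum(g(sample()) for _ in range(n)) / n

# X uniform on [0, 1): Eg(X) = ∫_0^1 x^2 dx = 1/3.
print(expectation(lambda x: x * x, random.random))                      # ≈ 0.333

# Remark 5: P(A) = E[1_A] for the event A = {X <= 1/4}.
print(expectation(lambda x: 1.0 if x <= 0.25 else 0.0, random.random))  # ≈ 0.25
```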

3 Random Vectors

Definition 3.1. If X_1, . . . , X_n are random variables defined on the same probability space (Ω,F,P), then the vector X ≜ (X_1, . . . , X_n) is a random mapping Ω → R^n and is called a random vector. Since each X_i is a random variable, the joint event ∩_{i∈[n]} X_i^{−1}(−∞,x_i] ∈ F, and the joint distribution of the random vector X is defined as

F_X(x_1, . . . , x_n) = P(∩_{i∈[n]} X_i^{−1}(−∞,x_i]), for all x ∈ R^n.

Definition 3.2. Two random variables X, Y are independent if σ(X) and σ(Y) are independent event subspaces.

Remark 6. Since σ(X) and σ(Y) are generated by the collections (X^{−1}(−∞,x] : x ∈ R) and (Y^{−1}(−∞,y] : y ∈ R), it follows that the random variables X, Y are independent if and only if for all x, y ∈ R, we have

F_{X,Y}(x,y) = F_X(x) F_Y(y).

Definition 3.3. A random vector X: Ω → R^n is independent if the joint distribution is the product of the marginals. That is,

F_X(x) = ∏_{i=1}^n F_{X_i}(x_i), for all x ∈ R^n.
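
Finally, an added sketch of Definition 3.3 with two assumed constructions (illustrative choices, not from the notes): for independent uniform coordinates the empirical joint distribution function approximately factorizes into the product of the marginals, while for the dependent vector (X_1, X_1) it does not.

```python
import random

random.seed(3)
N = 100_000

# Independent coordinates: X1, X2 i.i.d. uniform on [0, 1).
indep = [(random.random(), random.random()) for _ in range(N)]
# Dependent vector: the second coordinate repeats the first.
dep = [(x1, x1) for (x1, _) in indep]

def joint_F(sample, x1, x2):
    """Empirical joint distribution function F_X(x1, x2)."""
    return sum(a <= x1 and b <= x2 for a, b in sample) / len(sample)

def marginal_product(sample, x1, x2):
    """Product of the empirical marginals F_{X1}(x1) * F_{X2}(x2)."""
    F1 = sum(a <= x1 for a, _ in sample) / len(sample)
    F2 = sum(b <= x2 for _, b in sample) / len(sample)
    return F1 * F2

x1, x2 = 0.4, 0.7
print(joint_F(indep, x1, x2), marginal_product(indep, x1, x2))  # ≈ 0.28, ≈ 0.28
print(joint_F(dep, x1, x2), marginal_product(dep, x1, x2))      # ≈ 0.40, ≈ 0.28
```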

