Lecture-18: Random Processes

1 Introduction

Definition 1.1 (Random process). Let $(\Omega, \mathcal{F}, P)$ be a probability space. For an arbitrary index set $T$ and state space $\mathcal{X} \subseteq \mathbb{R}$, a random process is a measurable map $X : \Omega \to \mathcal{X}^T$. That is, for each outcome $\omega \in \Omega$, we have a function $X(\omega) : T \to \mathcal{X}$ called the sample path or the sample function of the process $X$, also written as

$$X(\omega) \triangleq (X_t(\omega) \in \mathcal{X} : t \in T).$$

1.1 Classification

The state space $\mathcal{X}$ can be countable or uncountable, corresponding to a discrete-valued or continuous-valued process, respectively.

If the index set $T$ is countable, the stochastic process is called a discrete-time stochastic process or random sequence. When the index set $T$ is uncountable, it is called a continuous-time stochastic process. The index set $T$ doesn't have to be time; if the index set is space, then the stochastic process is a spatial process.

When $T = \mathbb{R}^n \times [0, \infty)$, the stochastic process $X$ is a spatio-temporal process.

Example 1.2. We list some examples of each such stochastic process.

i Discrete random sequence: brand switching, discrete-time queues, number of people at a bank each day.

ii Continuous random sequence: stock prices, currency exchange rates, waiting time in queue of the $n$th arrival, workload at arrivals in time-sharing computer systems.

iii Discrete random process: counting processes, population sampled at birth-death instants, number of people in queues.

iv Continuous random process: water level in a dam, waiting time till service in a queue, location of a mobile node in a network.

1.2 Measurability

For any finite subset $S \subseteq T$ and real vector $x \in \mathbb{R}^T$ such that $x_t = \infty$ for any $t \notin S$, we define a set $A(x) \triangleq \{y \in \mathbb{R}^T : y_t \le x_t\} = \times_{t \in T} (-\infty, x_t]$. Then, the measurability of the random process $X$ implies that for any such set $A(x)$, we have

$$\{\omega : X_t(\omega) \le x_t, t \in T\} = \bigcap_{t \in T} X_t^{-1}(-\infty, x_t] = X^{-1}\Big(\times_{t \in T} (-\infty, x_t]\Big) = X^{-1}(A(x)) \in \mathcal{F}.$$

Remark 1. The realization of a random process at each $t \in T$ is a random variable defined on the probability space $(\Omega, \mathcal{F}, P)$ such that $X_t : \Omega \to \mathcal{X}$. This follows from the fact that for any $t \in T$ and $x_t \in \mathbb{R}$, we can take the Borel measurable set $A(y) = (-\infty, x_t] \times \times_{s \ne t} \mathbb{R}$. Then, $X^{-1}(A(y)) = X_t^{-1}(-\infty, x_t] \in \mathcal{F}$.

Remark 2. The random process $X$ can be thought of as a collection of random variables $X = (X_t \in \mathcal{X} : t \in T)$ or as an ensemble of sample paths $X = (X(\omega) \in \mathcal{X}^T : \omega \in \Omega)$. Recall that $\mathcal{X}^T$ is the set of all functions from the index set $T$ to the state space $\mathcal{X}$.


Example 1.3 (Bernoulli sequence). Let the index set be $T = \mathbb{N} = \{1, 2, \ldots\}$ and the sample space be the collection of infinite binary sequences of successes (S) and failures (F), defined by $\Omega = \{S, F\}^{\mathbb{N}}$. An outcome $\omega$ is an infinite sequence $\omega = (\omega_1, \omega_2, \ldots)$ such that $\omega_n \in \{S, F\}$ for each $n \in \mathbb{N}$. We define the random process $X : \Omega \to \{0, 1\}^{\mathbb{N}}$ such that $X(\omega) = (1_{\{S\}}(\omega_1), 1_{\{S\}}(\omega_2), \ldots)$. That is, we have

$$X_n(\omega) = 1_{\{S\}}(\omega_n), \qquad X(\omega) = (1_{\{S\}}(\omega_n) : n \in \mathbb{N}).$$

Hence, we can write the process as the collection of random variables $X = (X_n \in \{0, 1\} : n \in \mathbb{N})$ or the collection of sample paths $X = (X(\omega) \in \{0, 1\}^{\mathbb{N}} : \omega \in \Omega)$.
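As a quick illustration (a sketch of my own, not part of the lecture; the function name and the success probability parameter `p` are my choices), the first $n$ coordinates of one sample path $X(\omega)$ of this Bernoulli process can be simulated as follows:

```python
import random

def bernoulli_path(n, p=0.5, seed=None):
    """Simulate the first n coordinates X_1(w), ..., X_n(w) of a single
    sample path, where X_k(w) = 1 iff the k-th trial is a success."""
    rng = random.Random(seed)
    return [1 if rng.random() < p else 0 for _ in range(n)]

# one sample path of length 10; the seed fixes the outcome omega
path = bernoulli_path(10, p=0.5, seed=42)
print(path)
```

Each call with a fresh seed corresponds to drawing a new outcome $\omega \in \Omega$ and reading off its sample path.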

1.3 Distribution

To define a measure on a random process, we can either put a measure on sample paths, or equip the collection of random variables with a joint measure. We are interested in identifying the joint distribution $F : \mathbb{R}^T \to [0, 1]$. To this end, for any $x \in \mathbb{R}^T$ we need to know

$$F_X(x) \triangleq P\Big(\bigcap_{t \in T} \{\omega \in \Omega : X_t(\omega) \le x_t\}\Big) = P\Big(\bigcap_{t \in T} X_t^{-1}(-\infty, x_t]\Big) = P \circ X^{-1}\Big(\times_{t \in T} (-\infty, x_t]\Big).$$

However, even for a simple independent process with countably infinite $T$, any function of the above form would be zero if $x_t$ is finite for all $t \in T$. Therefore, we only look at the values of $F(x)$ when $x_t \in \mathbb{R}$ for indices $t$ in a finite set $S$ and $x_t = \infty$ for all $t \notin S$. That is, for any finite set $S \subseteq T$, we focus on the product sets of the form

$$A(x) \triangleq \times_{s \in S} (-\infty, x_s] \times \times_{s \notin S} \mathbb{R},$$

where $x \in \mathbb{R}^T$ and $x_t = \infty$ for $t \notin S$. Recall that by definition of measurability, $X^{-1}(A(x)) \in \mathcal{F}$, and hence $P \circ X^{-1}(A(x))$ is well defined.

Definition 1.4 (Finite dimensional distribution). We can define a finite dimensional distribution for any finite set $S \subseteq T$ and $x_S = \{x_s \in \mathbb{R} : s \in S\}$,

$$F_{X_S}(x_S) \triangleq P\Big(\bigcap_{s \in S} \{\omega \in \Omega : X_s(\omega) \le x_s\}\Big) = P\Big(\bigcap_{s \in S} X_s^{-1}(-\infty, x_s]\Big).$$

The set of all finite dimensional distributions of the stochastic process $X = (X_t \in \mathcal{X} : t \in T)$ characterizes its distribution completely. Simpler characterizations of a stochastic process $X$ are in terms of its moments: the first moment, such as the mean, and the second moments, such as the correlation and covariance functions,

$$m_X(t) \triangleq E X_t, \qquad R_X(t, s) \triangleq E X_t X_s, \qquad C_X(t, s) \triangleq E(X_t - m_X(t))(X_s - m_X(s)).$$
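These moment functions can be estimated from an ensemble of sample paths by averaging across the ensemble at fixed time points. The sketch below is my own illustration (not from the notes); as a toy process it uses $X_t = U$ constant in $t$ with $U \sim \mathrm{Uniform}(0, 1)$, for which $m_X(t) = 1/2$, $R_X(t, s) = E U^2 = 1/3$ and $C_X(t, s) = \mathrm{Var}(U) = 1/12$:

```python
import random

def empirical_moments(paths, t, s):
    """Estimate m_X(t), R_X(t,s), C_X(t,s) from an ensemble of sample
    paths, each path given as a list indexed by time."""
    n = len(paths)
    m_t = sum(p[t] for p in paths) / n
    m_s = sum(p[s] for p in paths) / n
    r_ts = sum(p[t] * p[s] for p in paths) / n  # correlation E[X_t X_s]
    c_ts = r_ts - m_t * m_s                     # covariance
    return m_t, r_ts, c_ts

# toy ensemble: each path is X_t = U for all t, with U ~ Uniform(0,1)
rng = random.Random(0)
paths = [[rng.random()] * 3 for _ in range(100_000)]
m, r, c = empirical_moments(paths, 0, 1)
# m ~ 1/2, r ~ 1/3, c ~ 1/12 up to Monte Carlo error
```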

Example 1.5. Some examples of simple stochastic processes.

i $X_t = A \cos 2\pi t$, where $A$ is random. The finite dimensional distribution is given by

$$F_{X_S}(x) = P(\{A \cos 2\pi s \le x_s, s \in S\}).$$

The moments are given by

$$m_X(t) = (EA) \cos 2\pi t, \quad R_X(t, s) = (EA^2) \cos 2\pi t \cos 2\pi s, \quad C_X(t, s) = \mathrm{Var}(A) \cos 2\pi t \cos 2\pi s.$$


ii $X_t = \cos(2\pi t + \Theta)$, where $\Theta$ is random and uniformly distributed on $(-\pi, \pi]$. The finite dimensional distribution is given by

$$F_{X_S}(x) = P(\{\cos(2\pi s + \Theta) \le x_s, s \in S\}).$$

The moments are given by

$$m_X = 0, \qquad R_X(t, s) = \tfrac{1}{2} \cos 2\pi (t - s), \qquad C_X(t, s) = R_X(t, s).$$

iii $X_n = U^n$ for $n \in \mathbb{N}$, where $U$ is uniformly distributed in the open interval $(0, 1)$.

iv $Z_t = At + B$, where $A$ and $B$ are independent random variables.
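As a sanity check on the random-phase cosine of item ii (an illustrative simulation of my own; the time points are arbitrary), a Monte Carlo estimate of the mean and correlation should match $m_X = 0$ and $R_X(t, s) = \frac{1}{2} \cos 2\pi (t - s)$:

```python
import math
import random

rng = random.Random(1)
N = 200_000
t, s = 0.3, 0.1  # two arbitrary time points

sum_x = 0.0
sum_xy = 0.0
for _ in range(N):
    theta = rng.uniform(-math.pi, math.pi)  # Theta ~ Uniform(-pi, pi]
    xt = math.cos(2 * math.pi * t + theta)
    xs = math.cos(2 * math.pi * s + theta)
    sum_x += xt
    sum_xy += xt * xs

m_hat = sum_x / N    # estimate of m_X = 0
r_hat = sum_xy / N   # estimate of R_X(t, s)
r_theory = 0.5 * math.cos(2 * math.pi * (t - s))
```

Averaging over the uniformly distributed phase kills the mean, while the product of the two cosines retains the $\frac{1}{2} \cos 2\pi (t - s)$ term that depends only on the lag $t - s$.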

1.4 Independence

Recall, given the probability space $(\Omega, \mathcal{F}, P)$, two events $A, B \in \mathcal{F}$ are independent events if $P(A \cap B) = P(A) P(B)$.

Random variables $X, Y$ defined on the above probability space are independent random variables if for all $x, y \in \mathbb{R}$,

$$P\{X(\omega) \le x, Y(\omega) \le y\} = P\{X(\omega) \le x\} P\{Y(\omega) \le y\}.$$

A stochastic process $X$ is said to be independent if for all finite subsets $S \subseteq T$, the finite collection of events $\{\{X_s \le x_s\} : s \in S\}$ are independent. That is, we have

$$F_{X_S}(x_S) = P\big(\cap_{s \in S} \{X_s \le x_s\}\big) = \prod_{s \in S} P\{X_s \le x_s\} = \prod_{s \in S} F_{X_s}(x_s).$$

Two stochastic processes $X, Y$ on the common index set $T$ are independent random processes if for all finite subsets $I, J \subseteq T$, the events $\{X_i \le x_i, i \in I\}$ and $\{Y_j \le y_j, j \in J\}$ are independent. That is,

$$F_{X_I, Y_J}(x_I, y_J) \triangleq P\big(\{X_i \le x_i, i \in I\} \cap \{Y_j \le y_j, j \in J\}\big) = P\big(\cap_{i \in I} \{X_i \le x_i\}\big) P\big(\cap_{j \in J} \{Y_j \le y_j\}\big) = F_{X_I}(x_I) F_{Y_J}(y_J).$$

Example 1.6 (Bernoulli sequence). Let the Bernoulli sequence $X$ defined in Example 1.3 be independent and identically distributed with $P\{X_n = 1\} = p \in (0, 1)$. For any sequence $x \in \{0, 1\}^{\mathbb{N}}$, we have $P\{X = x\} = 0$. Let $q \triangleq 1 - p$; then the probability of observing any particular finite sequence with $m$ successes and $r$ failures is $p^m q^r$. We can easily compute the mean, the auto-correlation, and the auto-covariance functions for this independent Bernoulli process: for $m \ne n$,

$$m_X(n) = E X_n = p, \qquad R_X(m, n) = E X_m X_n = E X_m E X_n = p^2, \qquad C_X(m, n) = 0.$$
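A quick numerical check of these moments (my own illustration; $p$ and the ensemble size are arbitrary choices): simulating the process at two distinct indices $m \ne n$ reproduces $m_X = p$, $R_X(m, n) = p^2$ and $C_X(m, n) = 0$ up to Monte Carlo error.

```python
import random

rng = random.Random(7)
p = 0.3
N = 100_000  # number of simulated sample paths

sum_m = 0
sum_mn = 0
for _ in range(N):
    xm = 1 if rng.random() < p else 0  # X_m
    xn = 1 if rng.random() < p else 0  # X_n, independent of X_m since m != n
    sum_m += xm
    sum_mn += xm * xn

m_hat = sum_m / N      # estimate of m_X(m) = p
r_hat = sum_mn / N     # estimate of R_X(m, n) = p^2
c_hat = r_hat - p * p  # estimate of C_X(m, n) = 0
```

Note that for $m = n$ the correlation would instead be $E X_n^2 = p$ and the covariance $p q$, since $X_n$ is not independent of itself.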

