Remark 3. The results of this theorem will usually be used in the form of the statement «The functions of independent random variables are also independent».
2.4. Conditional distributions
The exponential distribution, together with its discrete analogue, the geometric distribution $P\{\xi = k\} = q^k (1-q)$, $k = 0, 1, \dots$, is the only absolutely continuous distribution possessing the above-noted remarkable property. The latter is a consequence of the uniqueness of the solution of the functional equation obtained from (29'): $P(x + a) = P(x)\,P(a)$.
We now discuss in more detail the cases of discrete and absolutely continuous random variables.
If $\xi, \eta$ are discrete random variables and $p_{ij} = P\{\xi = x_i,\ \eta = y_j\}$ is their joint distribution law, then for $i, j = 1, 2, \dots$ (under the condition that the denominators of the fractions below are different from zero):

$$p_{i\cdot} = P\{\xi = x_i\} = \sum_j p_{ij}, \qquad p_{\cdot j} = P\{\eta = y_j\} = \sum_i p_{ij},$$

$$p_{i|j} = P\{\xi = x_i \mid \eta = y_j\} = \frac{p_{ij}}{p_{\cdot j}}, \qquad p_{j|i} = P\{\eta = y_j \mid \xi = x_i\} = \frac{p_{ij}}{p_{i\cdot}}, \quad \text{etc.},$$

and the conditional distribution function is

$$F_{\xi|\eta}(x \mid y_j) = P\{\xi \le x \mid \eta = y_j\} = \sum_{i:\, x_i \le x} p_{i|j} = \frac{1}{p_{\cdot j}} \sum_{i:\, x_i \le x} p_{ij}.$$
It is clear that analogous relations for conditional distributions can also be written out in the cases of three, four, and, in general, any finite number of discrete random variables.
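As a quick numerical illustration of these formulas, the following sketch (an addition to the text, not part of the original) computes the marginal and conditional laws from a small joint table $p_{ij}$; the particular matrix `p` is an arbitrary example.

```python
import numpy as np

# Hypothetical joint law p_ij = P{xi = x_i, eta = y_j} for a 3x2 table (rows: i, columns: j).
p = np.array([[0.10, 0.20],
              [0.30, 0.15],
              [0.05, 0.20]])

p_i = p.sum(axis=1)          # P{xi = x_i}  = sum_j p_ij
p_j = p.sum(axis=0)          # P{eta = y_j} = sum_i p_ij

# Conditional law of xi given eta = y_j: p_{i|j} = p_ij / p_.j  (each column sums to 1).
p_i_given_j = p / p_j

# Conditional distribution function F_{xi|eta}(x | y_j) at the points x = x_i:
# cumulative sums of p_{i|j} over i with x_i <= x.
F_given_j = np.cumsum(p_i_given_j, axis=0)

print("marginal of xi :", p_i)
print("marginal of eta:", p_j)
print("P{xi = x_i | eta = y_1}:", p_i_given_j[:, 0])
print("F_{xi|eta}(x_i | y_1)  :", F_given_j[:, 0])
```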
For absolutely continuous random variables, we need to introduce the notion of conditional distribution density, taking into account the remarks made above about integrals.
If $\xi, \eta$ are absolutely continuous random variables, then for any $y \in R$ the probability $P\{\eta = y\} = 0$; therefore we cannot determine the conditional distribution function $F_{\xi|\eta}(x \mid y) = P\{\xi \le x \mid \eta = y\}$ directly by formula (29). Therefore, at this point, we first define the conditional distribution function with respect to the event $B = \{y \le \eta < y + \Delta y\}$ according to formula (29), then pass to the limit as $\Delta y \to 0$, and it is precisely the resulting limit function that we understand as the conditional distribution function $F_{\xi|\eta}(x \mid y)$.
So,

$$F_{\xi|\eta}(x \mid B) = P\{\xi \le x \mid y \le \eta < y + \Delta y\} = \frac{\displaystyle\int_{-\infty}^{x} \int_{y}^{y+\Delta y} f_{\xi,\eta}(u, \vartheta)\, d\vartheta\, du}{\displaystyle\int_{y}^{y+\Delta y} f_{\eta}(\vartheta)\, d\vartheta},$$

$$F_{\xi|\eta}(x \mid y) = \lim_{\Delta y \to 0} F_{\xi|\eta}(x \mid B) = \frac{\displaystyle\int_{-\infty}^{x} f_{\xi,\eta}(u, y)\, du}{f_{\eta}(y)},$$
where $f_{\xi,\eta}(x, y)$ is the joint distribution density of the random variables $\xi, \eta$ and $f_{\eta}(y)$ is the distribution density of the random variable $\eta$. The derivative with respect to $x$ of the conditional distribution function $F_{\xi|\eta}(x \mid y)$ will be called the conditional distribution density of the random variable $\xi$ under the condition $\eta = y$, and we denote it by $f_{\xi|\eta}(x \mid y)$. So

$$f_{\xi|\eta}(x \mid y) = \frac{f_{\xi,\eta}(x, y)}{f_{\eta}(y)}, \qquad f_{\eta}(y) \ne 0. \tag{30}$$

From this

$$f_{\xi,\eta}(x, y) = f_{\eta}(y)\, f_{\xi|\eta}(x \mid y). \tag{31}$$

The last formula reminds us of the formula for multiplication of probabilities; therefore formula (31) will be referred to as the density multiplication formula.
It is not difficult to guess that the following important formulas also hold:
$$F_{\xi|\eta}(x \mid y) = \int_{-\infty}^{x} f_{\xi|\eta}(u \mid y)\, du, \qquad f_{\xi}(x) = \int_{-\infty}^{\infty} f_{\xi,\eta}(x, y)\, dy = \int_{-\infty}^{\infty} f_{\eta}(y)\, f_{\xi|\eta}(x \mid y)\, dy. \tag{32}$$
The second of formulas (32) is an analogue of the formula of total probabilities for continuous random variables.
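To see this "total probability" formula at work numerically, here is a small sketch (added for illustration; the model $\eta \sim N(0,1)$, $\xi \mid \eta = y \sim N(y, 1)$ is an assumption of the example). In that case the mixture integral should reproduce the $N(0, 2)$ density of $\xi$.

```python
import numpy as np

def phi(x, m=0.0, s2=1.0):
    """Normal density with mean m and variance s2."""
    return np.exp(-(x - m) ** 2 / (2 * s2)) / np.sqrt(2 * np.pi * s2)

# Assumed model: eta ~ N(0,1), and conditionally xi | eta = y ~ N(y, 1).
y = np.linspace(-10, 10, 4001)          # integration grid for y
dy = y[1] - y[0]

for x in (-1.0, 0.0, 2.5):
    # Second formula of (32): f_xi(x) = integral of f_eta(y) * f_{xi|eta}(x|y) dy
    mixture = np.sum(phi(y) * phi(x, m=y, s2=1.0)) * dy
    exact = phi(x, m=0.0, s2=2.0)       # xi = eta + independent N(0,1) noise, so xi ~ N(0,2)
    print(f"x = {x:4.1f}:  integral = {mixture:.6f},  N(0,2) density = {exact:.6f}")
```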
Example 17. Consider a two-dimensional normal vector $(\xi, \eta)$. Recalling that the distribution density of such a random vector is determined by formula (10'') and writing down the formula for the marginal distribution density $f_{\eta}(y)$, for the conditional distribution density of $\xi$ under the condition $\eta = y$ we obtain the formula

$$f_{\xi|\eta}(x \mid y) = \frac{1}{\sqrt{2\pi}\,\sigma_1 \sqrt{1-\rho^2}}\, \exp\left\{-\frac{\bigl(x - m(y)\bigr)^2}{2\sigma_1^2 (1-\rho^2)}\right\},$$

where $m(y) = a_1 + \rho\,\dfrac{\sigma_1}{\sigma_2}\,(y - a_2)$. Note that this function is the distribution density $N\bigl(m(y),\ \sigma_1^2(1-\rho^2)\bigr)$ of a normally distributed random variable.
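A simulation sketch of Example 17 (added for illustration; the numerical values of $a_1, a_2, \sigma_1, \sigma_2, \rho$ and of the conditioning point $y$ are arbitrary assumptions): draw a bivariate normal sample, keep the points with $\eta$ in a thin strip around $y$, and compare the conditional sample mean and variance of $\xi$ with $m(y)$ and $\sigma_1^2(1-\rho^2)$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed parameters of the two-dimensional normal vector (xi, eta).
a1, a2 = 1.0, -2.0          # means
s1, s2 = 2.0, 1.5           # standard deviations sigma_1, sigma_2
rho = 0.7                   # correlation coefficient
y0 = -1.0                   # we condition on eta close to y0

cov = [[s1**2, rho * s1 * s2], [rho * s1 * s2, s2**2]]
xi, eta = rng.multivariate_normal([a1, a2], cov, size=2_000_000).T

# Keep sample points with eta in a thin strip around y0 (the limit Delta y -> 0 above).
strip = np.abs(eta - y0) < 0.01
m_y = a1 + rho * s1 / s2 * (y0 - a2)        # m(y0)
var_cond = s1**2 * (1 - rho**2)             # sigma_1^2 (1 - rho^2)

print("conditional mean:     empirical %.3f   theoretical %.3f" % (xi[strip].mean(), m_y))
print("conditional variance: empirical %.3f   theoretical %.3f" % (xi[strip].var(), var_cond))
```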
Example 18. Let $\xi$ be a nonnegative random variable ($\xi > 0$) whose distribution density $f_{\xi}(x) = f(x)$ is a continuous function. Suppose first that the random variable $\xi$ is observed, and then a random variable $\eta$ is defined as a random variable distributed uniformly on the interval $(0, \xi)$. In addition, we assume that the random variables $\eta$ and $\xi - \eta$ are independent. We will show that then the distribution density of the random variable $\xi$ is given by

$$f_{\xi}(x) = f(x) = \begin{cases} a^2 x\, e^{-ax}, & x > 0, \\ 0, & x \le 0, \end{cases}$$

where $a$ is a positive constant.

Solution. From the condition of the problem we obtain that the conditional distribution density of $\eta$ under the condition $\xi = x$ is the function

$$f_{\eta|\xi}(y \mid x) = \frac{1}{x} \quad (0 < y < x), \qquad f_{\eta|\xi}(y \mid x) = 0 \quad (y > x).$$

If we denote $f_{\eta}(y) = g(y)$, then by (32)

$$g(y) = \int_{y}^{\infty} f(x)\, f_{\eta|\xi}(y \mid x)\, dx = \int_{y}^{\infty} \frac{f(x)}{x}\, dx,$$

consequently

$$g'(y) = -\frac{f(y)}{y}, \qquad f(y) = -y\, g'(y).$$

Next, we write the condition of independence of $\eta$ and $\xi - \eta$:

$$f_{\eta,\xi}(y, x) = f_{\eta,\,\xi-\eta}(y, x - y) = f_{\eta}(y)\, f_{\xi-\eta}(x - y) = g(y)\, f_{\xi-\eta}(x - y).$$

On the other hand,

$$f_{\eta,\xi}(y, x) = f_{\xi}(x)\, f_{\eta|\xi}(y \mid x) = \frac{1}{x}\, f(x) \quad (0 < y < x),$$

which means

$$g(y)\, f_{\xi-\eta}(x - y) = \frac{1}{x}\, f(x).$$

Whence, letting $y \to x$, we obtain the relation

$$f(x) = x\, g(x)\, f_{\xi-\eta}(0) = a\, x\, g(x),$$

where $a = f_{\xi-\eta}(0) > 0$. Substituting this value into the equation found earlier, $f(y) = -y\, g'(y)$, we have

$$a\, x\, g(x) = -x\, g'(x), \qquad g(x) = c\, e^{-ax}.$$

It follows from the condition $\int_{0}^{\infty} g(x)\, dx = 1$ that $c = a$. So

$$f(x) = a\, x\, g(x) = a^2 x\, e^{-ax} \quad (x > 0).$$
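A Monte Carlo sketch of the converse direction of Example 18 (added for illustration; the value of $a$ and the sample size are arbitrary assumptions): if $\xi$ has the density $a^2 x e^{-ax}$ and $\eta$ is uniform on $(0, \xi)$ given $\xi$, then $\eta$ and $\xi - \eta$ should behave like independent $Exp(a)$ variables.

```python
import numpy as np

rng = np.random.default_rng(1)
a = 2.0
n = 1_000_000

xi = rng.gamma(shape=2.0, scale=1.0 / a, size=n)   # density a^2 x e^{-a x}, x > 0
eta = rng.uniform(0.0, xi)                          # eta | xi is uniform on (0, xi)
rest = xi - eta

# Both eta and xi - eta should be Exp(a): mean 1/a, and (approximately) independent.
print("mean of eta     :", eta.mean(), " (theory", 1 / a, ")")
print("mean of xi - eta:", rest.mean(), " (theory", 1 / a, ")")

s, t = 0.3, 0.7
joint = np.mean((eta > s) & (rest > t))
product = np.mean(eta > s) * np.mean(rest > t)
print("P{eta>s, xi-eta>t} =", round(joint, 4), " vs product of marginals =", round(product, 4))
```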
2.5. Tasks for independent work
1. Let $\xi, \eta \sim N(0, \sigma^2)$ be independent random variables. Show that then $\zeta_1 = \xi^2 + \eta^2$ and $\zeta_2 = \eta/\xi$ are also independent random variables.
2. Let $\xi \sim \chi_n^2$, $\eta \sim \chi_m^2$, where $\xi$ and $\eta$ are independent. Show that then $\zeta_1 = \xi + \eta$ and $\zeta_2 = \eta/\xi$ are also independent random variables.
3. $\xi_1$ and $\xi_2$ are independent random variables uniformly distributed on $(0, 1)$. Show that the random variables $\eta_1 = \rho\cos\varphi$, $\eta_2 = \rho\sin\varphi$, where $\rho = \sqrt{-2\ln\xi_1}$, $\varphi = 2\pi\xi_2$, are independent normal random variables with parameters $(0, 1)$.
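Problem 3 is the Box-Muller transform; the sketch below (added for illustration, with an arbitrary sample size) generates a sample this way and checks the first two moments and the empirical correlation.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000

xi1 = rng.uniform(size=n)
xi2 = rng.uniform(size=n)

rho = np.sqrt(-2.0 * np.log(xi1))   # rho = sqrt(-2 ln xi_1)
phi = 2.0 * np.pi * xi2             # phi = 2 pi xi_2
eta1, eta2 = rho * np.cos(phi), rho * np.sin(phi)

print("means    :", round(eta1.mean(), 3), round(eta2.mean(), 3))    # both close to 0
print("variances:", round(eta1.var(), 3), round(eta2.var(), 3))      # both close to 1
print("corr     :", round(np.corrcoef(eta1, eta2)[0, 1], 3))         # close to 0
```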
4. $\xi \sim Bi(\nu;\, p)$, where, in its turn, $\nu \sim Bi(m;\, q)$. Show that then $\xi \sim Bi(m;\, pq)$.
Direction. First find the joint distribution law of the random variables $\xi$ and $\nu$, and then find the marginal distribution law of $\xi$.
5. First, a Poisson random variable $\xi$ with parameter $\lambda$ is observed. After that, $\xi$ independent Bernoulli trials with success probability $p$ are performed.
Find the distribution law of the random variable $\eta$, the number of successes in these trials.
6. Suppose that for each fixed $\lambda > 0$ the random variable $\xi \sim \Pi(\lambda)$, and $\lambda$ in turn has a gamma distribution with parameter $\alpha$, i.e. its distribution density is the function

$$f_{\lambda}(x) = \begin{cases} \dfrac{x^{\alpha - 1} e^{-x}}{\Gamma(\alpha)}, & x \ge 0, \\ 0, & x < 0, \end{cases}$$

where $\alpha$ is a fixed positive number and $\Gamma(\alpha)$ is the gamma function.
Show that the distribution law of $\xi$ is defined by the formulas

$$P\{\xi = k\} = \frac{\Gamma(k + \alpha)}{\Gamma(\alpha)\,\Gamma(k + 1)} \left(\frac{1}{2}\right)^{k + \alpha}, \qquad k = 0, 1, 2, \dots$$
Direction. First show that for any event $A$ and any continuous random variable $\eta$ the formula

$$P(A) = \int_{-\infty}^{\infty} P(A \mid \eta = x)\, f_{\eta}(x)\, dx$$

holds, and then apply this formula to the case $A = \{\xi = k\}$, $\eta = \lambda$.
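A numeric sanity check of the answer to problem 6 (an added sketch; the value of $\alpha$ and the quadrature grid are arbitrary choices): compute $P\{\xi = k\} = \int_0^\infty \frac{x^k e^{-x}}{k!}\, f_\lambda(x)\, dx$ by quadrature and compare it with the stated closed form.

```python
import numpy as np
from math import gamma, factorial

alpha = 2.5
x = np.linspace(1e-8, 60.0, 600_001)
dx = x[1] - x[0]
f_lambda = x ** (alpha - 1) * np.exp(-x) / gamma(alpha)   # gamma density with parameter alpha

for k in range(5):
    integral = np.sum(x ** k * np.exp(-x) / factorial(k) * f_lambda) * dx
    closed = gamma(k + alpha) / (gamma(alpha) * gamma(k + 1)) * 0.5 ** (k + alpha)
    print(f"k={k}:  quadrature {integral:.6f}   closed form {closed:.6f}")
```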
7. $\xi \sim \Pi(\lambda)$; in turn, $\lambda$ is an exponential random variable with parameter $c$. Find the distribution law of the random variable $\xi$.
8. Continuation. In the previous task the parameter $\lambda$, in turn, obeys the gamma distribution of order $\alpha$ with scale parameter $c$ ($c > 0$, $\alpha > -1$), i.e. the distribution density of $\lambda$ is given by

$$f_{\lambda}(x) = \frac{c^{\alpha + 1}}{\Gamma(\alpha + 1)}\, x^{\alpha} e^{-cx} \quad (x > 0); \qquad f_{\lambda}(x) = 0 \quad (x \le 0).$$

Show that the distribution law of the random variable $\xi$ is determined by the relations

$$P\{\xi = k\} = \frac{\Gamma(k + \alpha + 1)}{k!\,\Gamma(\alpha + 1)} \left(\frac{c}{c + 1}\right)^{\alpha + 1} \left(\frac{1}{c + 1}\right)^{k}, \qquad k = 0, 1, 2, \dots$$
9. $\xi \sim Bi(n, p)$. In turn, $p$ is a beta-distributed random variable with parameters $r$ and $s$, i.e. the distribution density of $p$ is given by

$$f_p(x) = \frac{\Gamma(r + s)}{\Gamma(r)\,\Gamma(s)}\, x^{r - 1} (1 - x)^{s - 1}, \quad 0 < x < 1; \qquad f_p(x) = 0, \quad x \notin (0, 1).$$

a) Show that the distribution law of the random variable $\xi$ is determined by the probabilities

$$P\{\xi = k\} = C_n^k\, \frac{\Gamma(r + s)\,\Gamma(k + r)\,\Gamma(n - k + s)}{\Gamma(r)\,\Gamma(s)\,\Gamma(n + r + s)}, \qquad k = 0, 1, 2, \dots, n.$$

b) Show that for $r = s = 1$ this distribution is the uniform distribution on the set $\{0, 1, 2, \dots, n\}$, i.e. the distribution of $\xi$ is determined by the relations

$$P\{\xi = k\} = \frac{1}{n + 1}, \qquad k = 0, 1, \dots, n.$$
10. $\xi_1 \sim Bi(n, p)$; a random variable $\xi_2$, under the condition $\xi_1$, is a binomial random variable with parameters $(\xi_1, p)$; a random variable $\xi_3$, under the condition $\xi_2$, is a binomial random variable with parameters $(\xi_2, p)$, etc.
Show that then the random variable $\xi_k$ is a binomial random variable with parameters $(n, p^k)$, i.e. show that $\xi_k \sim Bi(n, p^k)$.
11. $\xi_1$ and $\xi_2$ are independent random variables: $\xi_1 \sim N(0, \sigma_1^2)$, $\xi_2 \sim N(0, \sigma_2^2)$. Show that then

$$\xi = \frac{\xi_1 \xi_2}{\sqrt{\xi_1^2 + \xi_2^2}} \sim N(0, \sigma^2),$$

where the parameter $\sigma^2$ is related to the given parameters $\sigma_1^2, \sigma_2^2$ by the relation

$$\frac{1}{\sigma^2} = \frac{1}{\sigma_1^2} + \frac{1}{\sigma_2^2}.$$

Direction. Use the relation $\dfrac{1}{\xi^2} = \dfrac{1}{\xi_1^2} + \dfrac{1}{\xi_2^2}$.
12. $\xi, \eta$ are independent random variables: $\xi \sim N\!\left(0, \dfrac{1}{n}\right)$, i.e. $\xi$ is a random variable with distribution density

$$f_{\xi}(x) = \sqrt{\frac{n}{2\pi}}\, e^{-\frac{n x^2}{2}},$$

and $\eta = \dfrac{\chi_n}{\sqrt{n}}$, with distribution density

$$f_{\eta}(x) = \frac{2\left(\frac{n}{2}\right)^{\frac{n}{2}}}{\Gamma\!\left(\frac{n}{2}\right)}\, x^{n - 1} e^{-\frac{n x^2}{2}} \qquad (x \ge 0).$$

Find the distribution density of the random variable $\zeta = \dfrac{\xi}{\eta}$.
13. Let $\xi_1, \xi_2, \dots, \xi_n$ be independent identically distributed exponential random variables with parameter $\lambda$: $f_{\xi_i}(t) = \lambda e^{-\lambda t}$, $t \ge 0$ (distribution density).
a) Show the validity of the formula

$$P\{\xi_1 + \xi_2 + \dots + \xi_n > t\} = \sum_{j=0}^{n-1} \frac{(\lambda t)^j}{j!}\, e^{-\lambda t}.$$

Then, using this formula, find the density of the distribution of the sum $S_n = \xi_1 + \dots + \xi_n$.
b) Find the distribution density of $S_n = \xi_1 + \dots + \xi_n$ using the composition formula (formulas (17)).
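The formula in problem 13 a) is the Erlang/Poisson duality $P\{S_n > t\} = P\{\Pi(\lambda t) \le n - 1\}$; the added sketch below checks it by simulation for one arbitrary choice of $n$, $\lambda$, $t$.

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(3)
lam, n, t = 1.5, 4, 2.0

samples = rng.exponential(scale=1.0 / lam, size=(1_000_000, n)).sum(axis=1)
empirical = np.mean(samples > t)
formula = sum((lam * t) ** j / factorial(j) * exp(-lam * t) for j in range(n))  # j = 0..n-1

print("P{S_n > t}: simulation", round(empirical, 4), " formula", round(formula, 4))
```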
14. Continuation. Find the distribution density of the random variable

$$\eta = \frac{\xi_1}{\xi_1 + \xi_2 + \dots + \xi_n}.$$
15. $\xi_1, \xi_2, \dots$ is a sequence of independent identically distributed exponential random variables with parameter $\lambda$; $S_0 = 0$, $S_1 = \xi_1$, ..., $S_n = \xi_1 + \xi_2 + \dots + \xi_n$, .... We define a random variable $\nu_t$ as the number of indices $k \ge 1$ satisfying the condition $S_k \le t$:

$$\nu_t = \sum_{k=1}^{\infty} I_{\{S_k \le t\}},$$

where $I_A$ is the indicator of the event $A$. Show that $\nu_t$ is a Poisson random variable with parameter $\lambda t$: $\nu_t \sim \Pi(\lambda t)$.
16. Record values. Let $\xi_0, \xi_1, \xi_2, \dots, \xi_n, \dots$ be independent identically distributed exponential random variables with parameter $\alpha$. We define the random variable $\nu$ by the condition

$$\nu = \min\{n:\ \xi_1 - \xi_0 < 0,\ \xi_2 - \xi_0 < 0,\ \dots,\ \xi_{n-1} - \xi_0 < 0,\ \xi_n - \xi_0 > 0\},$$

i.e. $\nu$ is the index of the first of the variables $\xi_1, \xi_2, \dots$ exceeding $\xi_0$. Find the distribution function and the distribution density of the first record random variable $\xi_{\nu}$. Direction. Notice that

$$P\{\xi_{\nu} \le x\} = \sum_{n=1}^{\infty} P\{\xi_{\nu} \le x,\ \nu = n\} = \sum_{n=1}^{\infty} P\{\xi_1 < \xi_0,\ \xi_2 < \xi_0,\ \dots,\ \xi_{n-1} < \xi_0,\ \xi_0 < \xi_n \le x\}$$

and find the probability of this event.
17. Let $\xi_1, \xi_2, \dots, \xi_n, \dots$ be a sequence of mutually independent random variables uniformly distributed on the segment $[0, a]$,

$$S_0 = 0,\quad S_1 = \xi_1,\quad S_2 = \xi_1 + \xi_2,\ \dots,\ S_n = \xi_1 + \xi_2 + \dots + \xi_n,\ \dots,$$

$$F_n(x) = F_{S_n}(x) = P\{S_n \le x\}, \qquad f_n(x) = F_n'(x).$$

Then, using the composition formula, show that for $n = 1, 2, \dots$ the formula

$$f_{n+1}(x) = \frac{1}{a}\int_{0}^{a} f_n(x - y)\, dy = \frac{1}{a}\bigl[F_n(x) - F_n(x - a)\bigr]$$

holds. Further, applying induction, obtain the following formulas:

$$F_n(x) = \frac{1}{a^n\, n!}\sum_{k=0}^{n} (-1)^k C_n^k \bigl[(x - ka)^{+}\bigr]^{n}; \qquad f_n(x) = \frac{1}{a^n\, (n-1)!}\sum_{k=0}^{n} (-1)^k C_n^k \bigl[(x - ka)^{+}\bigr]^{n-1},$$

where $x^{+} = \dfrac{x + |x|}{2}$ is the positive part of the real number $x$ ($x^{+} = x$ if $x > 0$, $x^{+} = 0$ if $x \le 0$) and $\bigl[(x)^{+}\bigr]^{n} = (x^{+})^{n}$.
18. Continuation. Show that if the random variables $\xi_1, \xi_2, \dots, \xi_n$ are independent and uniformly distributed on the segment $[-b, b]$, then the distribution density of the random variable $S_n = \xi_1 + \xi_2 + \dots + \xi_n$ is determined by the formula

$$f_n(x) = \frac{1}{(2b)^n (n-1)!}\sum_{k=0}^{n} (-1)^k C_n^k \bigl[(x + (n - 2k)b)^{+}\bigr]^{n-1}.$$
19. Triangular densities. Show that the convolution of the distribution densities of two independent random variables uniformly distributed on $[-a, a]$, i.e. of two independent random variables with the distribution density

$$f_a(x) = \frac{1}{2a}, \quad |x| < a; \qquad f_a(x) = 0, \quad |x| \ge a,$$

is the function

$$g_a(x) = \frac{1}{2a}\left(1 - \frac{|x|}{2a}\right), \quad |x| < 2a; \qquad g_a(x) = 0, \quad |x| \ge 2a,$$

i.e. a triangular distribution density; in other words, show that $(f_a * f_a)(x) = g_a(x)$.
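A numerical check of the convolution in problem 19 (an added sketch; the value of $a$ and the grid are arbitrary choices): discretize $f_a$, convolve it with itself, and compare with the triangular density.

```python
import numpy as np

a = 1.5
x = np.linspace(-4 * a, 4 * a, 8001)     # symmetric grid, so np.convolve(..., "same") is centered at 0
dx = x[1] - x[0]

f = np.where(np.abs(x) < a, 1.0 / (2 * a), 0.0)          # uniform density on (-a, a)
conv = np.convolve(f, f, mode="same") * dx               # discrete approximation of (f_a * f_a)(x)
triangle = np.where(np.abs(x) < 2 * a, (1 - np.abs(x) / (2 * a)) / (2 * a), 0.0)

print("max |conv - triangle| =", np.max(np.abs(conv - triangle)))
```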
20. Let $\xi, \eta$ be independent random variables distributed according to the Cauchy law with parameters $a$ and $b$ respectively, i.e. independent random variables with the densities

$$\gamma_a(x) = \frac{a}{\pi (x^2 + a^2)} \quad \text{and} \quad \gamma_b(x).$$

Show that then $(\gamma_a * \gamma_b)(x) = \gamma_{a+b}(x)$.
Further, it follows from what has been said that if $\xi_1, \xi_2, \dots, \xi_n$ are independent random variables identically distributed according to the Cauchy law with parameter $\alpha$, then the distribution density of the mean

$$\overline{\xi} = \frac{\xi_1 + \xi_2 + \dots + \xi_n}{n}$$

coincides with $\gamma_{\alpha}(x)$: $f_{\overline{\xi}}(x) = \gamma_{\alpha}(x)$.
21. Continuation. Let $\xi_1, \xi_2$ be independent random variables distributed according to the Cauchy law with distribution density $\gamma_{\alpha}(x)$.
Show that the distribution density of the sum $\eta_1 + \eta_2$ of the random variables $\eta_1 = a\xi_1 + b\xi_2$, $\eta_2 = c\xi_1 + d\xi_2$ ($a, b, c, d > 0$) is the function $\gamma_{(a+b+c+d)\alpha}(x)$; in other words, this density is the convolution of the function $\gamma_{(a+b)\alpha}(x)$ (the distribution density of the random variable $\eta_1$) and the function $\gamma_{(c+d)\alpha}(x)$ (the distribution density of the random variable $\eta_2$).
22. Let $\xi_1, \xi_2, \dots$ be independent identically distributed random variables with distribution density $f(x) = x e^{-x}$, $x \ge 0$; $f(x) = 0$, $x < 0$. We introduce the notation $S_0 = 0$, $S_n = \xi_1 + \xi_2 + \dots + \xi_n$, $n = 1, 2, \dots$, and define a new random variable $\nu_t$ as follows:

$$\nu_t = \min\{n:\ S_1 \le t,\ S_2 \le t,\ \dots,\ S_{n-1} \le t,\ S_n > t\}.$$

Find the distribution law of $\nu_t$.
23. Let

$$(\xi_1, \xi_2) \sim N\!\left((0, 0),\ \begin{pmatrix} 1 & \rho \\ \rho & 1 \end{pmatrix}\right).$$

We introduce the random variables (polar coordinates) $r, \varphi$:

$$r = \sqrt{\xi_1^2 + \xi_2^2}, \qquad \varphi = \operatorname{arctg}\frac{\xi_2}{\xi_1}.$$

Show that the distribution density of the random variable $\varphi$ is the function

$$f_{\varphi}(\psi) = \frac{\sqrt{1 - \rho^2}}{2\pi\,(1 - 2\rho\sin\psi\cos\psi)}, \quad 0 \le \psi < 2\pi; \qquad f_{\varphi}(\psi) = 0, \quad \psi \notin [0, 2\pi),$$

and that only for $\rho = 0$ does the random variable $\varphi$ turn out to be uniformly distributed and independent of $r$.
24. $\xi_1, \xi_2, \dots, \xi_n$ are independent identically distributed random variables taking the values $+1$ and $-1$ with probabilities $p$ and $1 - p = q$ respectively; $\eta_n = \xi_1 \xi_2 \cdots \xi_n$.
Prove that then

$$P\{\eta_n = 1\} = \frac{1}{2}\bigl[1 + (p - q)^n\bigr], \qquad P\{\eta_n = -1\} = \frac{1}{2}\bigl[1 - (p - q)^n\bigr].$$

25. Let
$$(\xi_1, \xi_2, \xi_3) \sim N\!\left((0, 0, 0),\ \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}\right).$$

We define a new random vector

$$\eta = (\eta_1, \eta_2, \eta_3) = (\xi_1,\ \xi_2 + \xi_1,\ \xi_3 + \xi_1).$$

Find the following quantities:
a) the distribution density of the vector $\eta$;
b) the marginal distribution density $f_{\eta_2, \eta_3}(x_2, x_3)$;
c) the conditional distribution density $f_{\eta_1, \eta_2 | \eta_3}(x_1, x_2 \mid y)$;
d) the marginal distribution density $f_{\eta_2}(x_2)$;
e) the conditional distribution density $f_{\eta_2 | \eta_3}(x_2 \mid y)$;
f) the conditional distribution density $f_{\eta_1 | \eta_2, \eta_3}(x_1 \mid y, z)$.
26. Obtaining exponential random variables through uniformly distributed random variables.
Let $\xi_1, \xi_2, \dots$ be a sequence of independent random variables uniformly distributed on $[0, 1]$. We define a new random variable as follows:

$$\nu = \min\{n:\ \xi_1 \ge \xi_2 \ge \dots \ge \xi_{n-1},\ \xi_{n-1} < \xi_n\}.$$

Prove that then

$$P\{\xi_1 \le x,\ \nu = n\} = \frac{x^{n-1}}{(n-1)!} - \frac{x^n}{n!}, \qquad 0 \le x \le 1,$$

and then show that this implies $P\{\xi_1 \le x,\ \nu \text{ is an even number}\} = 1 - e^{-x}$.
27. Continuation. Consider the groups of values $\xi_1, \xi_2, \dots, \xi_{\nu}$ as trials: if $\nu$ is an odd number, we will say that there was a failure; if $\nu$ is an even number, we will say that there was a success.
Such independent trials are conducted until the first success occurs. The new random variable $\eta$ is defined as the sum of the number of failures and the value $\xi_1$ observed in the successful trial.
Prove that then $\eta$ is an exponential random variable with parameter $\lambda = 1$, i.e. a random variable with distribution function $P\{\eta \le x\} = 1 - e^{-x}$.
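Problems 26-27 describe von Neumann's classical method of generating $Exp(1)$ random variables from uniform ones. The sketch below (added for illustration; the function name and sample size are choices of this sketch) implements the procedure directly and compares the empirical distribution function with $1 - e^{-x}$.

```python
import numpy as np

rng = np.random.default_rng(4)

def von_neumann_exponential():
    """One Exp(1) variate as in problems 26-27: count failed rounds, add xi_1 of the successful round."""
    failures = 0
    while True:
        first = prev = rng.uniform()
        nu = 1
        while True:
            u = rng.uniform()
            nu += 1
            if u > prev:          # the descending run xi_1 >= xi_2 >= ... is broken at index nu
                break
            prev = u
        if nu % 2 == 0:           # nu even: success, accept xi_1 of this round
            return failures + first
        failures += 1             # nu odd: failure, start a new round

sample = np.array([von_neumann_exponential() for _ in range(200_000)])
for x in (0.5, 1.0, 2.0):
    print(f"P{{eta <= {x}}}: empirical {np.mean(sample <= x):.4f}   1 - e^-x = {1 - np.exp(-x):.4f}")
```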
Let $\xi_1, \xi_2, \dots, \xi_n$ ($n \ge 2$) be independent identically distributed random variables defined on a probability space $(\Omega, F, P)$. For each $\omega \in \Omega$ we arrange the numbers $\xi_1(\omega), \xi_2(\omega), \dots, \xi_n(\omega)$ in ascending order and renumber them: $\xi_{(1)} \le \xi_{(2)} \le \dots \le \xi_{(n)}$. The sequence of random variables constructed in this way is called the variational series corresponding to the random variables $\xi_1, \xi_2, \dots, \xi_n$, and the $k$-th term of this series, $\xi_{(k)}$, is called the $k$-th member of the variational series (the $k$-th order statistic). For example, in the special cases

$$\xi_{(1)} = \min\{\xi_1, \xi_2, \dots, \xi_n\}, \qquad \xi_{(n)} = \max\{\xi_1, \xi_2, \dots, \xi_n\}.$$

It is clear that the random variables $\xi_{(1)}, \xi_{(2)}, \dots, \xi_{(n)}$ are not independent random variables (they are subject to the above inequalities).
28. Let $\xi_1, \xi_2, \dots, \xi_n$ be independent identically distributed exponential random variables with parameter $\lambda$, and let $\xi_{(1)}, \xi_{(2)}, \dots, \xi_{(n)}$ be the corresponding variational series.
Show that then $\xi_{(1)},\ \xi_{(2)} - \xi_{(1)},\ \xi_{(3)} - \xi_{(2)},\ \dots,\ \xi_{(n)} - \xi_{(n-1)}$ are independent random variables, and the distribution densities of the random variables $\xi_{(k)} - \xi_{(k-1)}$, $k = 1, 2, \dots, n$ ($\xi_{(0)} = 0$) are respectively the functions $\lambda(n - k + 1)\, e^{-(n - k + 1)\lambda t}$ $(t > 0)$, i.e. $\xi_{(k)} - \xi_{(k-1)}$ $(k = 1, 2, \dots, n)$ is an exponentially distributed random variable with parameter $\lambda_k = (n - k + 1)\lambda$.
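The added sketch below checks problem 28 by simulation ($n$ and $\lambda$ are arbitrary choices): the spacings of the ordered sample should have means $1/((n - k + 1)\lambda)$ and be (approximately) uncorrelated.

```python
import numpy as np

rng = np.random.default_rng(5)
lam, n, reps = 2.0, 5, 500_000

sample = np.sort(rng.exponential(scale=1.0 / lam, size=(reps, n)), axis=1)
spacings = np.diff(np.concatenate([np.zeros((reps, 1)), sample], axis=1), axis=1)

for k in range(1, n + 1):
    emp = spacings[:, k - 1].mean()
    theory = 1.0 / ((n - k + 1) * lam)     # spacing k should be Exp((n - k + 1) * lambda)
    print(f"k={k}: mean spacing {emp:.4f}   theory {theory:.4f}")

print("corr(spacing 1, spacing 2) =", round(np.corrcoef(spacings[:, 0], spacings[:, 1])[0, 1], 4))
```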
29. $\xi_1, \xi_2, \xi_3$ are independent $N(0, 1)$ random variables, and $\xi_{(1)}, \xi_{(2)}, \xi_{(3)}$ is the corresponding variational series.
Find the distribution densities of the following random variables and random vectors:
a) $(\xi_{(2)} - \xi_{(1)},\ \xi_{(3)} - \xi_{(1)})$; b) $(\xi_{(2)} - \xi_{(1)},\ \xi_{(3)} - \xi_{(2)})$; c) $\xi_{(2)} - \xi_{(1)}$ and $\xi_{(3)} - \xi_{(1)}$; d) $\eta = \dfrac{\xi_{(2)} - \xi_{(1)}}{\xi_{(3)} - \xi_{(1)}}$; e) $\xi_{(2)} - \xi_{(1)}$ and $\xi_{(3)} - \xi_{(2)}$; f) $\dfrac{\xi_{(2)} - \xi_{(1)}}{\xi_{(3)} - \xi_{(2)}}$.
30. $\xi_1, \xi_2, \dots, \xi_n$ are independent identically distributed random variables with distribution functions $F_{\xi_i}(x) = F(x)$ and distribution densities $f_{\xi_i}(x) = f(x)$, and $\xi_{(1)} \le \xi_{(2)} \le \dots \le \xi_{(n)}$ is the corresponding variational series.
Show the validity of the following formulas:
a)
$$F_{(k)}(x) = P\{\xi_{(k)} \le x\} = \sum_{l=k}^{n} C_n^l\, [F(x)]^l\, [1 - F(x)]^{n-l};$$
b)
$$f_{(k)}(x) = F_{(k)}'(x) = k\, C_n^k\, [F(x)]^{k-1}\, [1 - F(x)]^{n-k}\, f(x);$$
c)
$$F_{(k),(l)}(x, y) = P\{\xi_{(k)} \le x,\ \xi_{(l)} \le y\} = \sum_{i=k}^{n}\ \sum_{j=\max(i,\, l)}^{n} \frac{n!}{i!\,(j - i)!\,(n - j)!}\, [F(x)]^i\, [F(y) - F(x)]^{j-i}\, [1 - F(y)]^{n-j}, \quad \text{if } x < y;$$
$$F_{(k),(l)}(x, y) = F_{(l)}(y), \quad \text{if } x \ge y,$$
where $F_{(l)}(y)$ is the function defined in case a);
$$f_{(k),(l)}(x, y) = \frac{\partial^2 F_{(k),(l)}(x, y)}{\partial x\, \partial y} = \frac{n!}{(k-1)!\,(l-k-1)!\,(n-l)!}\, [F(x)]^{k-1} f(x)\, [F(y) - F(x)]^{l-k-1} f(y)\, [1 - F(y)]^{n-l},$$
if $x < y$; $f_{(k),(l)}(x, y) = 0$, if $x \ge y$.
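A quick simulation check of formula a) in problem 30 (an added sketch; the standard normal case and the values $k = 2$, $n = 5$, $x = 0.3$ are arbitrary choices):

```python
import numpy as np
from math import comb, erf, sqrt

rng = np.random.default_rng(6)
n, k, x = 5, 2, 0.3

sample = np.sort(rng.normal(size=(400_000, n)), axis=1)
empirical = np.mean(sample[:, k - 1] <= x)                     # P{xi_(k) <= x} by simulation

F = 0.5 * (1 + erf(x / sqrt(2)))                               # F(x) for the N(0,1) law
formula = sum(comb(n, l) * F**l * (1 - F)**(n - l) for l in range(k, n + 1))

print("P{xi_(2) <= x}: simulation", round(empirical, 4), " formula a)", round(formula, 4))
```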
31. Let $\xi_1, \xi_2, \dots$ be independent identically distributed exponential random variables with parameter $\lambda$, $S_0 = 0$, $S_1 = \xi_1$, ..., $S_n = \xi_1 + \xi_2 + \dots + \xi_n$, ..., and let the new random variable $\nu_t$ be defined as follows: for $t > 0$

$$\nu_t = \min\{k:\ S_0 \le t,\ S_1 \le t,\ \dots,\ S_{k-1} \le t,\ S_k > t\}.$$

Show that then the distribution density of the random variable $\xi_{\nu_t}$ is the function

$$f_{\xi_{\nu_t}}(x) = \begin{cases} \lambda^2 x\, e^{-\lambda x}, & 0 < x \le t, \\ \lambda (1 + \lambda t)\, e^{-\lambda x}, & x > t. \end{cases}$$
32. Continuation. Show that the random variable $S_{\nu_t} - t$ is an exponential random variable with parameter $\lambda$.