

2.5 Problems and Solutions

2.10 Derive the properties (b) and (c) from Proposition 2.2.

Hint: Use statement (a).

Solutions

2.1 Assuming finite fourth moments, we define for a random variable $X$ with $\mu = E(X)$ the kurtosis

\[ \gamma_2 = \frac{E(X-\mu)^4}{\sigma^4}\,. \]

Consider the standardized random variable $Z$ with expectation 0 and variance 1:

\[ Z = \frac{X-\mu}{\sigma} \qquad \text{with} \quad E(Z^2) = 1\,. \]

For this random variable, it holds that $\gamma_2 = E(Z^4)$. Replacing $X$ by $Z^2$ in (2.2), it follows that

\[ 1 = \left[E(Z^2)\right]^2 \le E\left(Z^4\right) = \gamma_2\,, \]

which proves the claim.
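The inequality used here, $[E(Z^2)]^2 \le E(Z^4)$, can also be checked numerically. The following sketch (an illustration only, not part of the original solution) estimates the kurtosis from simulated samples; the Bernoulli case shows that the lower bound 1 can actually be attained.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_kurtosis(x):
    """Estimate E[(X - mu)^4] / sigma^4 from a sample."""
    z = (x - x.mean()) / x.std()
    return np.mean(z**4)

# Very different distributions -- every estimate is at least 1.
for draw in (rng.normal(size=100_000),          # approx. 3
             rng.uniform(size=100_000),         # approx. 1.8
             rng.binomial(1, 0.5, 100_000)):    # approx. 1 (the boundary case)
    print(round(sample_kurtosis(draw), 3))
```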

2.2 For the $k$-th moment it holds:

\[ E(X^k) = \int_{-\infty}^{\infty} x^k f(x)\,dx = \int_{1}^{\infty} \lambda\,x^{k-\lambda-1}\,dx\,. \]

Case 1: If $\lambda \neq k$, then the antiderivative results in
\[ \int \lambda\,x^{k-\lambda-1}\,dx = \frac{\lambda}{k-\lambda}\,x^{k-\lambda}\,. \]
The corresponding improper integral is defined as a limit:
\[ \int_{1}^{\infty} \lambda\,x^{k-\lambda-1}\,dx = \lim_{M\to\infty}\left.\frac{\lambda}{k-\lambda}\,x^{k-\lambda}\right|_{1}^{M}\,. \]
For $\lambda > k$ it follows that
\[ \int_{1}^{\infty} \lambda\,x^{k-\lambda-1}\,dx = 0 - \frac{\lambda}{k-\lambda} = \frac{\lambda}{\lambda-k} < \infty\,. \]
For $\lambda < k$, however, no finite value is obtained, as $M^{k-\lambda}$ goes off to infinity.

Case 2: For $\lambda = k$ the antiderivative takes on another form:
\[ \int \lambda\,x^{k-\lambda-1}\,dx = \lambda \int x^{-1}\,dx = \lambda\,\log(x)\,. \]
As the logarithm is unbounded, or $\lambda\,\log(M) \to \infty$ for $M \to \infty$, one cannot obtain a finite expectation either, as the upper bound of integration is $\infty$.

Both cases jointly prove the claim: $E(X^k)$ is finite if and only if $\lambda > k$.
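Using the density from the computation above, $f(x) = \lambda\,x^{-\lambda-1}$ for $x \ge 1$ (a Pareto-type density), both cases can be confirmed numerically; the following is a sketch for illustration only, with arbitrarily chosen parameter values.

```python
import numpy as np
from scipy.integrate import quad

def integrand(x, lam, k):
    # x^k * f(x) with the Pareto-type density f(x) = lam * x**(-lam - 1), x >= 1
    return lam * x**(k - lam - 1)

# Case lambda > k: the k-th moment equals lam / (lam - k).
lam, k = 3.0, 2.0
value, _ = quad(integrand, 1, np.inf, args=(lam, k))
print(value, lam / (lam - k))          # both approx. 3.0

# Case lambda <= k: the truncated integral grows without bound as M increases.
for M in (1e2, 1e4, 1e6):
    value, _ = quad(integrand, 1, M, args=(1.0, 2.0))   # lam = 1 <= k = 2
    print(round(value))                                  # approx. M - 1, diverging
```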

2.3 We provide two proofs. The first one builds on the fact that (2.4) is a special case of (2.3). The second one is less abstract and more elementary, and hence instructive, too.

1. Note that $(X-\mu)^2$ is a nonnegative random variable. Therefore, (2.3) applies with $a = \varepsilon^2$:
\[ P\left((X-\mu)^2 \ge \varepsilon^2\right) \le \frac{E\left((X-\mu)^2\right)}{\varepsilon^2}\,. \]
The event $(X-\mu)^2 \ge \varepsilon^2$, however, is equivalent to $|X-\mu| \ge \varepsilon$, which establishes (2.4).

2. Elementarily, we prove the claim for the case that $X$ is a continuous random variable with density function $f$; the discrete case can be accomplished analogously. Note the following sequence of inequalities:
\[
\begin{aligned}
\operatorname{Var}(X) &= \int_{-\infty}^{\infty} (x-\mu)^2 f(x)\,dx \\
&\ge \int_{-\infty}^{\mu-\varepsilon} (x-\mu)^2 f(x)\,dx + \int_{\mu+\varepsilon}^{\infty} (x-\mu)^2 f(x)\,dx \\
&\ge \int_{-\infty}^{\mu-\varepsilon} \varepsilon^2 f(x)\,dx + \int_{\mu+\varepsilon}^{\infty} \varepsilon^2 f(x)\,dx\,.
\end{aligned}
\]
The first inequality is of course due to the omission of
\[ \int_{\mu-\varepsilon}^{\mu+\varepsilon} (x-\mu)^2 f(x)\,dx \ge 0\,. \]
The second one is accounted for by the fact that for the integrands of the respective integrals it holds that
\[ x - \mu < -\varepsilon \quad \text{for} \quad x < \mu - \varepsilon \]
and
\[ x - \mu > \varepsilon \quad \text{for} \quad x > \mu + \varepsilon\,. \]
Up to this point, it is therefore shown that
\[
\begin{aligned}
\operatorname{Var}(X) &\ge \varepsilon^2\,P(X \le \mu - \varepsilon) + \varepsilon^2\,P(X \ge \mu + \varepsilon) \\
&= \varepsilon^2\,P(|X-\mu| \ge \varepsilon)\,.
\end{aligned}
\]
This is equivalent to the claim.
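For illustration (not part of the book), the Chebyshev bound (2.4) just derived can be compared with simulated exceedance probabilities; the exponential distribution below is an arbitrary choice, any distribution with finite variance works.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(scale=2.0, size=1_000_000)   # arbitrary choice with finite variance
mu, var = x.mean(), x.var()

for eps in (1.0, 2.0, 5.0):
    prob = np.mean(np.abs(x - mu) >= eps)        # estimated P(|X - mu| >= eps)
    bound = var / eps**2                          # Chebyshev bound from (2.4)
    print(f"eps={eps}: P={prob:.4f} <= bound={bound:.4f}")
```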

2.4 The marginal density is obtained as follows:
\[ f_x(x) = \int_{-\infty}^{\infty} f_{x,y}(x,y)\,dy = \int_{0}^{b} \frac{1}{ab}\,dy = \frac{b-0}{ab} = \frac{1}{a} \]
for $x \in [0,a]$, and $f_x(x) = 0$ for $x \notin [0,a]$. It also holds that
\[ f_y(y) = \begin{cases} \frac{1}{b}\,, & y \in [0,b] \\ 0\,, & \text{else}\,. \end{cases} \]
Hence, one immediately obtains for all $x$ and $y$:
\[ f_{x,y}(x,y) = f_x(x)\,f_y(y)\,, \]
which was to be proved.
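The same computation can be mirrored numerically; this is only a sketch with arbitrarily chosen side lengths $a$ and $b$.

```python
from scipy.integrate import quad

a, b = 2.0, 5.0
f_joint = lambda x, y: 1.0 / (a * b)      # uniform density on [0, a] x [0, b]

# Marginal of X at an interior point: integrate the joint density over y.
f_x, _ = quad(lambda y: f_joint(0.5, y), 0, b)
print(f_x, 1 / a)                          # both 0.5 = 1/a

# The joint density equals the product of the marginals on the rectangle.
print(abs(f_joint(0.5, 3.0) - f_x * (1 / b)) < 1e-12)
```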

2.5 Obviously, the expected value of $X$ is zero,
\[ E(X) = 50\,P_x(X=50) - 50\,P_x(X=-50) = 0\,. \]
Therefore, it holds for the variance that:
\[
\begin{aligned}
\operatorname{Var}(X) &= E\left[(X - E(X))^2\right] = E(X^2) \\
&= 50^2\,P_x(X=50) + (-50)^2\,P_x(X=-50) \\
&= \frac{2500}{2} + \frac{2500}{2} = 2500\,.
\end{aligned}
\]


Also $Y$ is zero on average:
\[
\begin{aligned}
E(Y) &= -10\,P_y(Y=-10) - 20\,P_y(Y=-20) - 30\,P_y(Y=-30) \\
&\quad\; - 40\,P_y(Y=-40) + 0\,P_y(Y=0) + 100\,P_y(Y=100) \\
&= \frac{1}{6}\,(-10 - 20 - 30 - 40 + 100) = 0\,.
\end{aligned}
\]
Hence, the variance reads
\[ \operatorname{Var}(Y) = \frac{1}{6}\left[(-10)^2 + (-20)^2 + (-30)^2 + (-40)^2 + 0^2 + 100^2\right] = 2166.67\,. \]

For the covariance we obtain
\[
\begin{aligned}
\operatorname{Cov}(X,Y) &= E\left[(X - E(X))\,(Y - E(Y))\right] = E(X \cdot Y) \\
&= \sum_{i=1}^{2} \sum_{j=1}^{6} x_i\,y_j\,P_{x,y}(X = x_i, Y = y_j)\,.
\end{aligned}
\]
In order to compute it, the entire joint probability distribution is to be established, for instance:
\[
\begin{aligned}
P_{x,y}(X=-50, Y=-40) &= P(\{1,3,5\} \cap \{4\}) = P(\emptyset) = 0\,, \\
P_{x,y}(X=50, Y=-40) &= P(\{2,4,6\} \cap \{4\}) = P(\{4\}) = \tfrac{1}{6}\,, \\
P_{x,y}(X=-50, Y=-30) &= P(E^c \cap \{3\}) = P(\{3\}) = \tfrac{1}{6}\,, \\
P_{x,y}(X=50, Y=-30) &= P(E \cap \{3\}) = P(\emptyset) = 0\,.
\end{aligned}
\]
We may collect those numbers in a table:

            Y = -40   -30   -20   -10    0    100
  X = -50        0    1/6     0   1/6   1/6    0
  X =  50      1/6     0    1/6     0     0   1/6

Plugging in yields
\[ E(X \cdot Y) = \frac{1}{6}\left[-50 \cdot 40 + 50 \cdot 30 - 50 \cdot 20 + 50 \cdot 10 + 50 \cdot 0 + 50 \cdot 100\right] = 666.67\,. \]
Therefore one obtains for the correlation coefficient, apart from rounding errors, $\rho_{xy} = 0.286$.
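As a cross-check (an illustration only), all of these moments can be recomputed directly from the joint probability table:

```python
import numpy as np

# Joint probabilities from the table: rows X in {-50, 50},
# columns Y in {-40, -30, -20, -10, 0, 100}; nonzero cells carry 1/6 each.
x_vals = np.array([-50.0, 50.0])
y_vals = np.array([-40.0, -30.0, -20.0, -10.0, 0.0, 100.0])
p = np.array([[0, 1, 0, 1, 1, 0],
              [1, 0, 1, 0, 0, 1]]) / 6.0

ex  = np.sum(x_vals[:, None] * p)                       # E(X)   = 0
ey  = np.sum(y_vals[None, :] * p)                       # E(Y)   = 0
vx  = np.sum((x_vals[:, None] - ex)**2 * p)             # Var(X) = 2500
vy  = np.sum((y_vals[None, :] - ey)**2 * p)             # Var(Y) approx. 2166.67
cov = np.sum(np.outer(x_vals - ex, y_vals - ey) * p)    # approx. 666.67
print(vx, round(vy, 2), round(cov, 2),
      round(cov / np.sqrt(vx * vy), 3))                 # rho approx. 0.286
```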

2.6 It only remains to be shown that
\[ E(|Y|\,|Z|) \le \sqrt{E(Y^2)}\,\sqrt{E(Z^2)}\,. \]
In order to see that, we use the binomial formula and obtain
\[ \frac{Y^2}{E(Y^2)} - \frac{2\,|Y|\,|Z|}{\sqrt{E(Y^2)}\,\sqrt{E(Z^2)}} + \frac{Z^2}{E(Z^2)} = \left( \frac{|Y|}{\sqrt{E(Y^2)}} - \frac{|Z|}{\sqrt{E(Z^2)}} \right)^{2} \ge 0\,. \]
Therefore, the expectation of the left-hand side cannot become negative, which yields
\[ 1 - \frac{2\,E(|Y|\,|Z|)}{\sqrt{E(Y^2)}\,\sqrt{E(Z^2)}} + 1 = 2\left(1 - \frac{E(|Y|\,|Z|)}{\sqrt{E(Y^2)}\,\sqrt{E(Z^2)}}\right) \ge 0\,. \]
In particular, it can be observed that the expression is always positive except for the case $Y = Z$. Rearranging terms verifies the second inequality from (2.8).
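The inequality $E(|Y||Z|) \le \sqrt{E(Y^2)}\sqrt{E(Z^2)}$ holds for arbitrarily dependent pairs; the following simulation sketch (with an arbitrarily chosen, strongly dependent pair) illustrates this.

```python
import numpy as np

rng = np.random.default_rng(2)
y = rng.normal(size=500_000)
z = y**2 + rng.uniform(-1, 1, size=500_000)                # strongly dependent on y

lhs = np.mean(np.abs(y) * np.abs(z))                       # E(|Y||Z|)
rhs = np.sqrt(np.mean(y**2)) * np.sqrt(np.mean(z**2))      # sqrt(E(Y^2)) sqrt(E(Z^2))
print(lhs <= rhs, round(lhs, 3), round(rhs, 3))
```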

2.7 Plugging in $X - E(X)$ and $Y - E(Y)$ instead of $Y$ and $Z$ in (2.5), it follows by Cauchy-Schwarz that
\[ \bigl| E\left[(X - E(X))\,(Y - E(Y))\right] \bigr| \le \sqrt{E\left[(X - E(X))^2\right]}\,\sqrt{E\left[(Y - E(Y))^2\right]}\,, \]
which is the same as
\[ |\operatorname{Cov}(X,Y)| \le \sqrt{\operatorname{Var}(X)}\,\sqrt{\operatorname{Var}(Y)}\,. \]
This verifies the claim.
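The resulting covariance inequality can likewise be illustrated by simulation; this is only a sketch, and the data-generating process below is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=500_000)
y = 0.4 * x + rng.standard_t(df=5, size=500_000)   # correlated, non-Gaussian noise

cov = np.mean((x - x.mean()) * (y - y.mean()))     # sample Cov(X, Y)
bound = np.sqrt(x.var() * y.var())                 # sqrt(Var(X) Var(Y))
print(abs(cov) <= bound, round(abs(cov), 3), round(bound, 3))
```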

2.8 Due to
\[ F_{x,y}(x,y) = \int_{-\infty}^{y} \int_{-\infty}^{x} f_{x,y}(r,s)\,dr\,ds\,, \]
$f_{x,y}$ is determined by taking the partial derivative of $F_{x,y}$ with respect to both arguments:
\[ \frac{\partial^2 F_{x,y}(x,y)}{\partial x\,\partial y} = \frac{\partial\left[(1+e^{-x}+e^{-y})^{-2}\,e^{-x}\right]}{\partial y} = \frac{2\,e^{-x}\,e^{-y}}{(1+e^{-x}+e^{-y})^{3}} = f_{x,y}(x,y)\,. \]


The marginal distribution of $Y$ is determined by
\[ F_y(y) = \int_{-\infty}^{y} \int_{-\infty}^{\infty} f_{x,y}(x,s)\,dx\,ds = \lim_{x\to\infty} F_{x,y}(x,y) = (1+e^{-y})^{-1}\,. \]
The marginal density therefore reads
\[ f_y(y) = \frac{e^{-y}}{(1+e^{-y})^{2}}\,. \]
Division yields the conditional density:
\[ f_{x|y}(x) = \frac{f_{x,y}(x,y)}{f_y(y)} = \frac{2\,e^{-x}\,(1+e^{-y})^{2}}{(1+e^{-x}+e^{-y})^{3}}\,. \]
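The differentiation and the limit can be verified symbolically, starting from the joint cdf $F_{x,y}(x,y) = (1 + e^{-x} + e^{-y})^{-1}$ implied by the solution; the following sympy sketch is for illustration only.

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)
F = 1 / (1 + sp.exp(-x) + sp.exp(-y))       # joint cdf implied by the solution

f_joint = sp.simplify(sp.diff(F, x, y))     # equivalent to 2 e^{-x} e^{-y} / (1+e^{-x}+e^{-y})^3
print(f_joint)

F_y = sp.limit(F, x, sp.oo)                 # marginal cdf of Y: 1 / (1 + e^{-y})
f_y = sp.simplify(sp.diff(F_y, y))          # e^{-y} / (1 + e^{-y})^2
print(F_y, f_y)

print(sp.simplify(f_joint / f_y))           # conditional density f_{x|y}(x)
```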

2.9 The following sequence of equalities holds and will be justified in detail. The first two equations define exactly the corresponding (conditional) expectations. For the third equality, the order of integration is reversed; this is due to Fubini's theorem.

The fourth equation is again by definition (conditional density), whereas in the fifth equation only the density of $Y$ is cancelled out. In the sixth equation, the influence of $Y$ on the joint density is integrated out such that the marginal density of $X$ remains.

This again yields the expectation of $X$ by definition. Therefore, it holds that

\[
\begin{aligned}
E_y\left(E_x(X|Y)\right) &= \int_{-\infty}^{\infty} E_x(X|y)\,f_y(y)\,dy \\
&= \int_{-\infty}^{\infty} \left[\int_{-\infty}^{\infty} x\,f_{x|y}(x)\,dx\right] f_y(y)\,dy \\
&= \int_{-\infty}^{\infty} x \left[\int_{-\infty}^{\infty} f_{x|y}(x)\,f_y(y)\,dy\right] dx \\
&= \int_{-\infty}^{\infty} x \left[\int_{-\infty}^{\infty} \frac{f_{x,y}(x,y)}{f_y(y)}\,f_y(y)\,dy\right] dx \\
&= \int_{-\infty}^{\infty} x \left[\int_{-\infty}^{\infty} f_{x,y}(x,y)\,dy\right] dx \\
&= \int_{-\infty}^{\infty} x\,f_x(x)\,dx = E_x(X)\,,
\end{aligned}
\]
which was to be verified.
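For a case where $E_x(X|Y)$ is known in closed form, the law of iterated expectations can also be checked by simulation; this sketch (not from the book) uses the arbitrary specification $X = 2Y + \text{noise}$, so that $E_x(X|Y) = 2Y$.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1_000_000
y = rng.exponential(size=n)                 # E(Y) = 1
x = 2.0 * y + rng.normal(size=n)            # E(X | Y) = 2 Y

print(round(x.mean(), 3))                   # E(X)          approx. 2
print(round((2.0 * y).mean(), 3))           # E[E(X | Y)]   approx. 2 -- the same value
```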

2.10 We use statement (a), $E(x_t\,|\,x_{t-h}) = 0$ for $h > 0$, in connection with the law of iterated expectations:
\[ E(x_t) = E\left[E(x_t\,|\,x_{t-h})\right] = E(0) = 0\,. \]
This proves (b), that martingale differences are also unconditionally zero on average.

By applying both results of Proposition 2.1 for $h > 0$ again with (a), one arrives at
\[
\begin{aligned}
E(x_t\,x_{t+h}) &= E\left[E(x_t\,x_{t+h}\,|\,x_t)\right] \\
&= E\left[x_t\,E(x_{t+h}\,|\,x_t)\right] \\
&= E\left[x_t \cdot 0\right] \\
&= 0\,.
\end{aligned}
\]
Therefore, $\operatorname{Cov}(x_t, x_{t+h}) = 0$ for $h > 0$. However, as the covariance function is symmetric in $h$, the result holds for arbitrary $h \neq 0$, which was to be verified to show (c).
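Properties (b) and (c) can be illustrated with a simulated martingale difference sequence that is nevertheless dependent over time; the process $x_t = \varepsilon_t\,\varepsilon_{t-1}$ with i.i.d. noise is just one convenient example for this sketch.

```python
import numpy as np

rng = np.random.default_rng(5)
eps = rng.normal(size=200_001)
x = eps[1:] * eps[:-1]        # x_t = eps_t * eps_{t-1}: E(x_t | past) = 0, yet dependent

print(round(x.mean(), 4))     # unconditional mean approx. 0     -- property (b)
for h in (1, 2, 5):           # sample autocovariances approx. 0 -- property (c)
    print(h, round(np.mean(x[h:] * x[:-h]), 4))
```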
