3.5 Problems and Solutions


Solutions

3.1 The traditional way to solve the problem is curve sketching. We consider the first-order autocorrelation to be a function of $b$:
$$f(b) = \rho(1) = \frac{b}{1+b^2}\,.$$
Then the quotient rule for the first derivative yields
$$f'(b) = \frac{1+b^2-2b^2}{(1+b^2)^2} = \frac{(1-b)(1+b)}{(1+b^2)^2}\,.$$
The roots of the derivative are given by $|b| = 1$. At $b = -1$ there is a change of sign of $f'(b)$, namely from a negative to a positive slope. Hence, at $b = -1$ there is a relative (and also an absolute) minimum. Because $f(b)$ is an odd function (symmetric about the origin), there is a maximum at $b = 1$. Therefore, the maximum possible correlation in absolute value is
$$|f(-1)| = f(1) = \frac{1}{2}\,.$$

One may also tackle the problem by more elementary means. Note that for $b \neq 0$
$$|b| - 2 + \frac{1}{|b|} = \left(\sqrt{|b|} - \frac{1}{\sqrt{|b|}}\right)^{2} \geq 0\,,$$
which is equivalent to
$$\frac{1}{2} \geq \frac{1}{|b| + \frac{1}{|b|}} = \frac{|b|}{1+b^2} = |f(b)|\,,$$
with $f(b) = \rho(1)$ defined above. Since $|f(-1)| = f(1) = \frac{1}{2}$, this solves the problem.
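
As a cross-check (not part of the original argument), the bound can be verified numerically in Python; the grid of $b$-values below is an arbitrary choice.

    # Sketch: evaluate rho(1) = b/(1+b^2) on a grid and locate its extrema.
    import numpy as np

    b = np.linspace(-5.0, 5.0, 100001)
    b = b[b != 0.0]                      # b = 0 is excluded (rho(1) = 0 there anyway)
    rho = b / (1.0 + b**2)

    print(np.max(np.abs(rho)))           # approximately 0.5
    print(b[np.argmax(np.abs(rho))])     # approximately -1 (b = +1 gives the same absolute value)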

3.2 By $S_n$ we denote the following sum for finite $n$:
$$S_n = \sum_{i=0}^{n} g^i = 1 + g + \ldots + g^{n-1} + g^n\,.$$
Multiplication by $g$ yields
$$g\,S_n = g + g^2 + \ldots + g^n + g^{n+1}\,.$$


Therefore, it holds that
$$S_n - g\,S_n = 1 - g^{n+1}\,.$$
By ordinary factorization the formula
$$S_n = \frac{1 - g^{n+1}}{1 - g}$$
follows for $g \neq 1$, and therefore the claim is verified.
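
A minimal numerical sanity check of the closed form is sketched below; the values $g = 0.8$ and $n = 25$ are arbitrary illustration choices.

    # Compare the direct sum with the closed form S_n = (1 - g^(n+1)) / (1 - g), g != 1.
    g, n = 0.8, 25
    S_direct = sum(g**i for i in range(n + 1))
    S_formula = (1.0 - g**(n + 1)) / (1.0 - g)
    print(abs(S_direct - S_formula) < 1e-12)   # True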

3.3 The absolute summability of $\gamma(h)$ follows from the absolute summability of the linear coefficients $\{c_j\}$, allowing for a change of the order of summation. In order to do so, we first apply the triangle inequality:
$$\frac{1}{\sigma^2}\sum_{h=0}^{\infty} |\gamma(h)| = \sum_{h=0}^{\infty}\left|\,\sum_{j=0}^{\infty} c_j c_{j+h}\right| \leq \sum_{h=0}^{\infty}\sum_{j=0}^{\infty} \left|c_j c_{j+h}\right| = \sum_{h=0}^{\infty}\sum_{j=0}^{\infty} \left|c_j\right|\left|c_{j+h}\right| = \sum_{j=0}^{\infty}\left(\left|c_j\right| \sum_{h=0}^{\infty} \left|c_{j+h}\right|\right),$$

where at the end round brackets were placed for reasons of clarity. The final term is further bounded by enlarging the expression in brackets:
$$\sum_{j=0}^{\infty}\left(\left|c_j\right| \sum_{h=0}^{\infty} \left|c_{j+h}\right|\right) \leq \sum_{j=0}^{\infty}\left(\left|c_j\right| \sum_{h=0}^{\infty} \left|c_h\right|\right).$$

Therefore, the claim indeed follows from the absolute summability of $\{c_j\}$.
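
As a rough numerical illustration (not part of the original proof), the bound can be checked for a concrete absolutely summable sequence; the geometric coefficients $c_j = 0.9^j$, the truncation point $J$, and $\sigma^2 = 1$ below are assumptions made only for this sketch.

    # Check Sum_h |gamma(h)| <= sigma^2 * (Sum_j |c_j|)^2 with gamma(h) = sigma^2 * Sum_j c_j c_{j+h}.
    import numpy as np

    sigma2 = 1.0
    J = 500                                            # truncation of the infinite sums
    c = 0.9 ** np.arange(J)                            # assumed coefficients c_j = 0.9^j

    gamma = np.array([sigma2 * np.sum(c[: J - h] * c[h:]) for h in range(J)])
    lhs = np.sum(np.abs(gamma))
    rhs = sigma2 * np.sum(np.abs(c)) ** 2
    print(lhs <= rhs, round(lhs, 2), round(rhs, 2))    # True, about 52.6 and 100.0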

3.4 For the proof we denote $(1-aL)^{-1}$ as $\sum_{j=0}^{\infty} \alpha_j L^j$,
$$\frac{1}{1-aL} = \sum_{j=0}^{\infty} \alpha_j L^j\,,$$
and determine the coefficients $\alpha_j$. By multiplying this equation with $1-aL$, we obtain
$$1 = (1-aL)\sum_{j=0}^{\infty}\alpha_j L^j = \alpha_0 + \alpha_1 L^1 + \alpha_2 L^2 + \ldots - a\alpha_0 L^1 - a\alpha_1 L^2 - a\alpha_2 L^3 - \ldots\,.$$
Now, we compare the coefficients associated with $L^j$ on the left- and on the right-hand side:
$$1 = \alpha_0\,,\quad 0 = \alpha_1 - a\alpha_0\,,\quad 0 = \alpha_2 - a\alpha_1\,,\quad \ldots\,,\quad 0 = \alpha_j - a\alpha_{j-1}\,,\ j \geq 1\,.$$
As claimed, the solution of the difference equation obtained in this way ($\alpha_j = a\alpha_{j-1}$) is obviously $\alpha_j = a^j$.
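
A small numerical check of this coefficient comparison is sketched below; the value $a = 0.7$ and the truncation order $J$ are purely illustrative assumptions.

    # Multiplying the truncated series Sum_{j<=J} a^j L^j by (1 - aL) should return
    # 1 + 0*L + ... + 0*L^J (plus a remainder term of order L^(J+1)).
    import numpy as np

    a, J = 0.7, 20
    alpha = a ** np.arange(J + 1)              # candidate coefficients alpha_j = a^j
    prod = np.convolve([1.0, -a], alpha)       # coefficients of (1 - aL) * Sum alpha_j L^j
    print(np.allclose(prod[: J + 1], np.r_[1.0, np.zeros(J)]))   # True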

3.5 We factorize $P(z) = 1 + b_1 z + \ldots + b_p z^p$ with roots $z_1, \ldots, z_p$ of this polynomial (fundamental theorem of algebra):
$$P(z) = b_p (z - z_1) \cdots (z - z_p)\,.$$
From each bracket we factor out $-z_j$ such that
$$P(z) = b_p (-1)^p z_1 \cdots z_p \left(1 - \frac{z}{z_1}\right) \cdots \left(1 - \frac{z}{z_p}\right).$$
Because of $P(0) = 1$, we obtain $b_p (-1)^p z_1 \cdots z_p = 1$. Therefore the factorization simplifies to
$$P(z) = \left(1 - \frac{z}{z_1}\right) \cdots \left(1 - \frac{z}{z_p}\right) = P_1(z) \cdots P_p(z)\,,$$
with
$$P_k(z) = 1 - \frac{z}{z_k} = 1 - \lambda_k z\,, \quad k = 1, \ldots, p\,,$$


where $\lambda_k = 1/z_k$. From part a) we know that
$$\frac{1}{P_k(L)} = \sum_{j=0}^{\infty} \lambda_k^j L^j \quad\text{with}\quad \sum_{j=0}^{\infty} \left|\lambda_k^j\right| < \infty$$
if and only if
$$|z_k| = \frac{1}{|\lambda_k|} > 1\,.$$
Now, consider the convolution (sometimes called Cauchy product) for $k \neq \ell$:
$$\frac{1}{P_k(L)}\,\frac{1}{P_\ell(L)} = \sum_{j=0}^{\infty} c_j L^j \quad\text{with}\quad c_j := \sum_{i=0}^{j} \lambda_k^i \lambda_\ell^{\,j-i}\,.$$
We have $\sum_{j=0}^{\infty} |c_j| < \infty$ if and only if both $1/P_k(L)$ and $1/P_\ell(L)$ are absolutely summable, which holds true if and only if
$$|z_k| > 1 \quad\text{and}\quad |z_\ell| > 1\,.$$
Repeating this argument we obtain that
$$\frac{1}{P(L)} = \frac{1}{P_1(L)} \cdots \frac{1}{P_p(L)} = \sum_{j=0}^{\infty} c_j L^j \quad\text{with}\quad \sum_{j=0}^{\infty} |c_j| < \infty$$
if and only if (3.10) holds. Quod erat demonstrandum.
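
The following sketch illustrates the statement for a concrete second-order polynomial; the coefficients $b_1 = -1$ and $b_2 = 0.25$ are assumptions chosen so that both roots (a double root at $z = 2$) lie outside the unit circle, and the recursion simply implements $P(L)$ times the candidate series being equal to one.

    # Expand 1/P(L) for P(z) = 1 + b1*z + b2*z^2 recursively and inspect the coefficients.
    import numpy as np

    b1, b2 = -1.0, 0.25
    print(np.abs(np.roots([b2, b1, 1.0])))   # moduli of the roots: both equal to 2 > 1

    J = 200
    c = np.zeros(J + 1)
    c[0] = 1.0
    for j in range(1, J + 1):                # from P(L) * Sum_j c_j L^j = 1:
        c[j] = -(b1 * c[j - 1] + (b2 * c[j - 2] if j >= 2 else 0.0))
    print(np.sum(np.abs(c)))                 # about 4: the coefficients are absolutely summable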

3.6 At first we reformulate the autoregressive polynomial $A(z) = 1 - a_1 z - \ldots - a_p z^p$ in its factorized form with roots $z_1, \ldots, z_p$ (again by the fundamental theorem of algebra):
$$A(z) = -a_p (z - z_1) \cdots (z - z_p)\,.$$
For $z = 1$ this amounts to
$$A(1) = -a_p (1 - z_1) \cdots (1 - z_p)\,. \tag{3.15}$$
Because of $A(0) = 1$ we obtain as well:
$$1 = -a_p (-1)^p z_1 \cdots z_p\,. \tag{3.16}$$
Now we proceed in two steps, treating the cases of complex and real roots separately.

(A) Complex roots: Note that for a root $z_1 \in \mathbb{C}$ it holds that the complex conjugate, $z_2 = \bar{z}_1$, is a root as well. Then calculating with complex numbers yields for the product
$$(1 - z_1)(1 - z_2) = (1 - z_1)(1 - \bar{z}_1) = (1 - z_1)\,\overline{(1 - z_1)} = |1 - z_1|^2 > 0\,.$$
Hence, for $p > 2$, complex roots contribute positively to $A(1)$ in (3.15). If $p = 2$, the roots are only complex if $a_2 < 0$, since the discriminant is $a_1^2 + 4a_2$; hence, $A(1) > 0$ by (3.15).

(B) Real roots: Since the effect of complex roots is positive, we now concentrate on real roots $z_i$, for which $|z_i| > 1$ holds by assumption. So, we assume without loss of generality that the polynomial has no complex roots, or that all complex roots have been factored out. Two sub-cases have to be distinguished.

(1) Even degree: For an even $p$ we again distinguish between two cases. Case 1, $a_p > 0$: Because of (3.16) there has to be an odd number of negative roots and therefore an odd number of positive roots as well. For the latter it holds that $(1 - z_i) < 0$, while the former naturally fulfill $(1 - z_i) > 0$. Hence, as claimed, it follows from (3.15) that $A(1)$ is positive. Case 2, $a_p < 0$: In this case one argues quite analogously. Because of (3.16) there is an even number of positive roots and an even number of negative roots, such that the requested claim follows from (3.15) as well.

(2) Odd degree: For an odd $p$ one obtains the requested result as well by distinguishing the two cases for the sign of $a_p$. We omit details.

Hence, the proof is complete.
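
As a numerical illustration with assumed coefficients (not taken from the text), consider an AR(3) polynomial with $|a_1| + |a_2| + |a_3| < 1$, which is sufficient for all roots to lie outside the unit circle; $A(1)$ then comes out positive, as the proof asserts.

    # A(z) = 1 - a1*z - a2*z^2 - a3*z^3 with assumed coefficients; check roots and A(1).
    import numpy as np

    a = np.array([0.5, -0.3, 0.1])                 # assumed a1, a2, a3
    poly = np.concatenate(([1.0], -a))             # coefficients of A(z) by increasing power
    roots = np.roots(poly[::-1])                   # np.roots expects decreasing powers
    print(np.all(np.abs(roots) > 1.0))             # True: all roots outside the unit circle
    print(poly.sum())                              # A(1) = 1 - 0.5 + 0.3 - 0.1 = 0.7 > 0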

3.7 The normality of $\{\varepsilon_t\}$ implies a multivariate Gaussian distribution of
$$\begin{pmatrix} \varepsilon_{t+1} \\ \vdots \\ \varepsilon_{t+s} \end{pmatrix} \sim \mathcal{N}_s\!\left(\begin{pmatrix} 0 \\ \vdots \\ 0 \end{pmatrix},\ \sigma^2 I_s\right)$$
with the identity matrix $I_s$ of dimension $s$. The $s$-fold substitution yields
$$x_{t+s} = a_1^s x_t + a_1^{s-1}\varepsilon_{t+1} + \ldots + a_1\varepsilon_{t+s-1} + \varepsilon_{t+s} = a_1^s x_t + \sum_{i=0}^{s-1} a_1^i \varepsilon_{t+s-i}\,.$$


The sum over the white noise process has the moments
$$\mathrm{E}\left(\sum_{i=0}^{s-1} a_1^i \varepsilon_{t+s-i}\right) = 0\,, \qquad \mathrm{Var}\left(\sum_{i=0}^{s-1} a_1^i \varepsilon_{t+s-i}\right) = \sigma^2 \sum_{i=0}^{s-1} a_1^{2i}\,,$$
and, furthermore, it is normally distributed:
$$\sum_{i=0}^{s-1} a_1^i \varepsilon_{t+s-i} \sim \mathcal{N}\!\left(0,\ \sigma^2 \sum_{i=0}^{s-1} a_1^{2i}\right).$$
Hence, $x_{t+s}$ given $x_t$ follows a Gaussian distribution with the corresponding moments:
$$x_{t+s}\,|\,x_t \sim \mathcal{N}\!\left(a_1^s x_t,\ \sigma^2 \sum_{i=0}^{s-1} a_1^{2i}\right).$$

As $x_{t+s}$ can be expressed as a function of $x_t$ and $\varepsilon_{t+1}, \ldots, \varepsilon_{t+s}$ alone, the further past of the process does not matter for the conditional distribution of $x_{t+s}$. Therefore, for the entire information $I_t$ up to time $t$ it holds that
$$x_{t+s}\,|\,I_t \sim \mathcal{N}\!\left(a_1^s x_t,\ \sigma^2 \sum_{i=0}^{s-1} a_1^{2i}\right).$$

Hence, the Markov property (2.9) has been shown. It holds independently of the concrete value of $a_1$.
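
A Monte Carlo sketch of this conditional distribution is given below; the parameter values $a_1 = 0.8$, $\sigma = 1$, $x_t = 2$, $s = 5$ and the number of replications are illustrative assumptions.

    # Simulate x_{t+s} given x_t for a Gaussian AR(1) and compare the sample moments
    # with a1^s * x_t and sigma^2 * Sum_{i=0}^{s-1} a1^(2i).
    import numpy as np

    rng = np.random.default_rng(0)
    a1, sigma, x_t, s, reps = 0.8, 1.0, 2.0, 5, 200_000

    x = np.full(reps, x_t)
    for i in range(s):                             # x_{t+i+1} = a1 * x_{t+i} + eps_{t+i+1}
        x = a1 * x + rng.normal(0.0, sigma, size=reps)

    mean_theory = a1**s * x_t
    var_theory = sigma**2 * np.sum(a1 ** (2 * np.arange(s)))
    print(round(x.mean(), 3), round(mean_theory, 3))   # close to each other
    print(round(x.var(), 3), round(var_theory, 3))     # close to each other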

3.8 For
$$x_t = a_2 x_{t-2} + \varepsilon_t\,,$$
we obtain for $s = 1$ the conditional expectations
$$\mathrm{E}(x_{t+1}\,|\,I_t) = a_2 x_{t-1} \quad\text{and}\quad \mathrm{E}(x_{t+1}\,|\,x_t) = \mathrm{E}(a_2 x_{t-1} + \varepsilon_{t+1}\,|\,x_t) = a_2\,\mathrm{E}(x_{t-1}\,|\,x_t)\,,$$
with $I_t = (x_t, x_{t-1}, \ldots, x_1)$. As the conditional expectations are not equivalent, the conditional distributions are not the same. Hence, it generally holds that
$$\mathrm{P}(x_{t+1} \leq x\,|\,x_t) \neq \mathrm{P}(x_{t+1} \leq x\,|\,I_t)\,,$$
which proves that $\{x_t\}$ is not a Markov process.
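
The failure of the Markov property can also be seen in a simulation; in the sketch below the value $a_2 = 0.7$, the sample size, and the least-squares projection check are illustrative devices, not part of the formal argument.

    # Simulate x_t = a2 * x_{t-2} + eps_t and regress x_{t+1} on (x_t, x_{t-1}):
    # the coefficient on x_{t-1} stays close to a2, so knowing x_t alone is not enough.
    import numpy as np

    rng = np.random.default_rng(1)
    a2, n = 0.7, 200_000
    x = np.zeros(n)
    eps = rng.normal(size=n)
    for t in range(2, n):
        x[t] = a2 * x[t - 2] + eps[t]

    y = x[2:]                                       # x_{t+1}
    X = np.column_stack([x[1:-1], x[:-2]])          # (x_t, x_{t-1})
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    print(np.round(beta, 3))                        # approximately [0, 0.7]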

