
Applying the operator $\tilde P_n$ to the second equation and using the properties of $\tilde P_n$ gives

$$\tilde P_n X_{n+1} = (\phi+\theta)\sum_{j=1}^{\infty}(-\theta)^{j-1}X_{n+1-j}.$$

Applying the operator $\tilde P_n$ to the first equation and using the properties of $\tilde P_n$ gives

$$\tilde P_n X_{n+1} = (\phi+\theta)\sum_{j=1}^{\infty}\phi^{j-1}Z_{n+1-j}.$$

Hence,

$$X_{n+1} - \tilde P_n X_{n+1} = Z_{n+1},$$

and so the mean squared error of the predictor $\tilde P_n X_{n+1}$ is $E Z_{n+1}^2 = \sigma^2$.
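This can be checked by simulation. The Python sketch below (numpy assumed; the parameter values $\phi = 0.5$, $\theta = 0.4$, $\sigma^2 = 1$ are illustrative, not from the text) generates an ARMA(1,1) sample, applies a truncated version of the infinite-past predictor, and confirms that the empirical mean squared error is close to $\sigma^2$:

```python
import numpy as np

rng = np.random.default_rng(0)
phi, theta, sigma2 = 0.5, 0.4, 1.0   # illustrative values; the text keeps them symbolic
n, reps = 200, 2000

errs = []
for _ in range(reps):
    Z = rng.normal(0.0, np.sqrt(sigma2), n + 1)
    X = np.zeros(n + 1)
    for t in range(1, n + 1):
        # ARMA(1,1): X_t = phi*X_{t-1} + Z_t + theta*Z_{t-1}
        X[t] = phi * X[t - 1] + Z[t] + theta * Z[t - 1]
    # truncated version of the predictor
    #   P~_n X_{n+1} = (phi+theta) * sum_{j>=1} (-theta)^(j-1) X_{n+1-j},
    # here predicting X[n] from X[n-1], ..., X[1]
    j = np.arange(1, n)
    pred = (phi + theta) * np.sum((-theta) ** (j - 1) * X[n - j])
    errs.append(X[n] - pred)

mse = np.mean(np.square(errs))
print(mse)   # close to sigma2 = 1, since the prediction error is Z_{n+1}
```

The truncation error is negligible here because the weights decay like $\theta^j$ with $|\theta| < 1$.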

2.6 The Wold Decomposition

Consider the stationary process

$$X_t = A\cos(\omega t) + B\sin(\omega t),$$

where $\omega \in (0, \pi)$ is constant and $A$, $B$ are uncorrelated random variables with mean 0 and variance $\sigma^2$. Notice that

$$X_n = (2\cos\omega)X_{n-1} - X_{n-2} = \tilde P_{n-1}X_n, \qquad n = 0, \pm 1, \ldots,$$

so that $X_n - \tilde P_{n-1}X_n = 0$ for all $n$. Processes with the latter property are said to be deterministic.
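The recursion can be verified numerically. In the Python sketch below (numpy assumed; the frequency $\omega = 0.7$ and the draws standing in for $A$ and $B$ are illustrative), every value of the process is reproduced exactly from its two predecessors, so no prediction error remains:

```python
import numpy as np

rng = np.random.default_rng(1)
omega = 0.7                      # any fixed frequency in (0, pi); chosen for illustration
A, B = rng.normal(size=2)        # stand-ins for the random coefficients
t = np.arange(0, 50)
X = A * np.cos(omega * t) + B * np.sin(omega * t)

# X_n = (2 cos w) X_{n-1} - X_{n-2} holds exactly, by the cosine/sine addition formulas
recursion = 2 * np.cos(omega) * X[1:-1] - X[:-2]
print(np.max(np.abs(X[2:] - recursion)))   # ~ 0, up to floating-point rounding
```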

The Wold Decomposition:

If $\{X_t\}$ is a nondeterministic stationary time series, then

$$X_t = \sum_{j=0}^{\infty}\psi_j Z_{t-j} + V_t, \qquad (2.6.1)$$

where

1. $\psi_0 = 1$ and $\sum_{j=0}^{\infty}\psi_j^2 < \infty$,
2. $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$,
3. $\mathrm{Cov}(Z_s, V_t) = 0$ for all $s$ and $t$,
4. $Z_t = \tilde P_t Z_t$ for all $t$,
5. $V_t = \tilde P_s V_t$ for all $s$ and $t$, and
6. $\{V_t\}$ is deterministic.

Here as in Section 2.5, $\tilde P_t Y$ denotes the best predictor of $Y$ in terms of linear combinations, or limits of linear combinations, of $1, X_s, -\infty < s \le t$. The sequences $\{Z_t\}$, $\{\psi_j\}$, and $\{V_t\}$ are unique and can be written explicitly as $Z_t = X_t - \tilde P_{t-1}X_t$, $\psi_j = E(X_t Z_{t-j})/E(Z_t^2)$, and $V_t = X_t - \sum_{j=0}^{\infty}\psi_j Z_{t-j}$. (See Brockwell and Davis (1991), p. 188.) For most of the zero-mean stationary time series dealt with in this book (in particular for all ARMA processes) the deterministic component $V_t$ is 0 for all $t$, and the series is then said to be purely nondeterministic.

68 Chapter 2 Stationary Processes

Example 2.6.1  If $X_t = U_t + Y$, where $\{U_t\} \sim \mathrm{WN}(0, \nu^2)$, $E(U_t Y) = 0$ for all $t$, and $Y$ has mean 0 and variance $\tau^2$, then $\tilde P_{t-1}X_t = Y$, since $Y$ is the mean square limit as $s \to \infty$ of $[X_{t-1} + \cdots + X_{t-s}]/s$, and $E[(X_t - Y)X_s] = 0$ for all $s \le t - 1$. Hence the sequences in the Wold decomposition of $\{X_t\}$ are given by $Z_t = U_t$, $\psi_0 = 1$, $\psi_j = 0$ for $j > 0$, and $V_t = Y$.
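A simulation makes the example concrete. In the Python sketch below (numpy assumed; the variances $\nu^2 = 1$ and $\tau^2 = 4$ are illustrative), averaging a long stretch of past values washes out the white noise and recovers the deterministic component $Y$:

```python
import numpy as np

rng = np.random.default_rng(2)
nu2, tau2 = 1.0, 4.0                 # illustrative variances for {U_t} and Y
s = 100_000
Y = rng.normal(0.0, np.sqrt(tau2))   # one realization of the random level Y
U = rng.normal(0.0, np.sqrt(nu2), s)
X = U + Y                            # the observations X_{t-1}, ..., X_{t-s}

# [X_{t-1} + ... + X_{t-s}]/s = Y + mean(U) -> Y in mean square as s -> infinity,
# so the best infinite-past predictor of X_t is Y itself
print(abs(X.mean() - Y))             # small: the deterministic component is recovered
```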

Problems

2.1 Suppose that $X_1, X_2, \ldots$ is a stationary time series with mean $\mu$ and ACF $\rho(\cdot)$. Show that the best predictor of $X_{n+h}$ of the form $aX_n + b$ is obtained by choosing $a = \rho(h)$ and $b = \mu(1 - \rho(h))$.

2.2 Show that the process

$$X_t = A\cos(\omega t) + B\sin(\omega t), \qquad t = 0, \pm 1, \ldots$$

(where $A$ and $B$ are uncorrelated random variables with mean 0 and variance 1 and $\omega$ is a fixed frequency in the interval $[0, \pi]$), is stationary and find its mean and autocovariance function. Deduce that the function $\kappa(h) = \cos(\omega h)$, $h = 0, \pm 1, \ldots$, is nonnegative definite.

2.3 a. Find the ACVF of the time series $X_t = Z_t + 0.3Z_{t-1} - 0.4Z_{t-2}$, where $\{Z_t\} \sim \mathrm{WN}(0, 1)$.

b. Find the ACVF of the time series $Y_t = \tilde Z_t - 1.2\tilde Z_{t-1} - 1.6\tilde Z_{t-2}$, where $\{\tilde Z_t\} \sim \mathrm{WN}(0, 0.25)$. Compare with the answer found in (a).
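A numerical cross-check for this problem (a Python sketch, not a substitute for the analytic computation): for any finite moving average $X_t = \sum_j \theta_j Z_{t-j}$ with $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$, the ACVF is $\gamma(h) = \sigma^2\sum_j \theta_j\theta_{j+|h|}$, which can be evaluated for both models:

```python
import numpy as np

def ma_acvf(coeffs, sigma2, max_lag):
    """ACVF of X_t = sum_j coeffs[j] * Z_{t-j}, {Z_t} ~ WN(0, sigma2):
    gamma(h) = sigma2 * sum_j coeffs[j] * coeffs[j + |h|]."""
    c = np.asarray(coeffs, dtype=float)
    return [sigma2 * np.dot(c[: len(c) - h], c[h:]) for h in range(max_lag + 1)]

gx = ma_acvf([1.0, 0.3, -0.4], 1.0, 3)     # part (a): coefficients 1, 0.3, -0.4
gy = ma_acvf([1.0, -1.2, -1.6], 0.25, 3)   # part (b): coefficients 1, -1.2, -1.6
print(gx)   # gamma(0..3) for part (a)
print(gy)   # identical, up to rounding: the two models share one ACVF
```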

2.4 It is clear that the function $\kappa(h) = 1$, $h = 0, \pm 1, \ldots$, is an autocovariance function, since it is the autocovariance function of the process $X_t = Z$, $t = 0, \pm 1, \ldots$, where $Z$ is a random variable with mean 0 and variance 1. By identifying appropriate sequences of random variables, show that the following functions are also autocovariance functions:

a. $\kappa(h) = (-1)^{|h|}$

b. $\kappa(h) = 1 + \cos\left(\dfrac{\pi h}{2}\right) + \cos\left(\dfrac{\pi h}{4}\right)$

c. $\kappa(h) = \begin{cases} 1, & \text{if } h = 0, \\ 0.4, & \text{if } h = \pm 1, \\ 0, & \text{otherwise.} \end{cases}$

2.5 Suppose that $\{X_t, t = 0, \pm 1, \ldots\}$ is stationary and that $|\theta| < 1$. Show that for each fixed $n$ the sequence

$$S_m = \sum_{j=1}^{m}\theta^j X_{n-j}$$

is convergent absolutely and in mean square (see Appendix C) as $m \to \infty$.

2.6 Verify the equations (2.2.6).


2.7 Show, using the geometric series $1/(1-x) = \sum_{j=0}^{\infty}x^j$ for $|x| < 1$, that $1/(1-\phi z) = -\sum_{j=1}^{\infty}\phi^{-j}z^{-j}$ for $|\phi| > 1$ and $|z| \ge 1$.
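A quick numerical sanity check of this identity (a sketch, not part of the problem; the values $\phi = 2$ and $z = 1.5$ are illustrative):

```python
# Check 1/(1 - phi*z) = -sum_{j>=1} phi^(-j) z^(-j) for |phi| > 1, |z| >= 1
phi, z = 2.0, 1.5
lhs = 1.0 / (1.0 - phi * z)
rhs = -sum(phi ** (-j) * z ** (-j) for j in range(1, 60))   # partial sum; terms decay geometrically
print(lhs, rhs)   # agree to machine precision (here both are -0.5)
```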

2.8 Show that the autoregressive equations

$$X_t = \phi X_{t-1} + Z_t, \qquad t = 0, \pm 1, \ldots,$$

where $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$ and $|\phi| = 1$, have no stationary solution. HINT: Suppose there does exist a stationary solution $\{X_t\}$ and use the autoregressive equation to derive an expression for the variance of $X_t - \phi^{n+1}X_{t-n-1}$ that contradicts the stationarity assumption.
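The hint can be seen numerically: with $\phi = 1$, iterating the equation gives $X_t - X_{t-n-1} = Z_t + Z_{t-1} + \cdots + Z_{t-n}$, whose variance $(n+1)\sigma^2$ grows without bound in $n$, whereas stationarity would force it to stay bounded. A Python sketch (numpy assumed, illustrative only):

```python
import numpy as np

rng = np.random.default_rng(3)
sigma2, reps = 1.0, 20_000
Z = rng.normal(0.0, np.sqrt(sigma2), (reps, 101))

# With phi = 1, X_t - X_{t-n-1} = Z_t + ... + Z_{t-n}, a sum of n+1 white-noise terms,
# so Var(X_t - X_{t-n-1}) = (n+1)*sigma2: unbounded in n, contradicting stationarity.
for n in (9, 49, 99):
    diffs = Z[:, : n + 1].sum(axis=1)
    print(n + 1, diffs.var())            # sample variance close to n+1
```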

2.9 Let $\{Y_t\}$ be the AR(1) plus noise time series defined by

$$Y_t = X_t + W_t,$$

where $\{W_t\} \sim \mathrm{WN}(0, \sigma_w^2)$, $\{X_t\}$ is the AR(1) process of Example 2.2.1, i.e.,

$$X_t - \phi X_{t-1} = Z_t, \qquad \{Z_t\} \sim \mathrm{WN}(0, \sigma_z^2),$$

and $E(W_s Z_t) = 0$ for all $s$ and $t$.

a. Show that $\{Y_t\}$ is stationary and find its autocovariance function.

b. Show that the time series $U_t := Y_t - \phi Y_{t-1}$ is 1-correlated and hence, by Proposition 2.1.1, is an MA(1) process.

c. Conclude from (b) that $\{Y_t\}$ is an ARMA(1,1) process and express the three parameters of this model in terms of $\phi$, $\sigma_w^2$, and $\sigma_z^2$.

2.10 Use the program ITSM to compute the coefficients $\psi_j$ and $\pi_j$, $j = 1, \ldots, 5$, in the expansions

$$X_t = \sum_{j=0}^{\infty}\psi_j Z_{t-j} \quad \text{and} \quad Z_t = \sum_{j=0}^{\infty}\pi_j X_{t-j}$$

for the ARMA(1,1) process defined by the equations

$$X_t - 0.5X_{t-1} = Z_t + 0.5Z_{t-1}, \qquad \{Z_t\} \sim \mathrm{WN}(0, \sigma^2).$$

(Select File>Project>New>Univariate, then Model>Specify. In the resulting dialog box enter 1 for the AR and MA orders, specify $\phi(1) = \theta(1) = 0.5$, and click OK. Finally, select Model>AR/MA Infinity>Default lag and the values of $\psi_j$ and $\pi_j$ will appear on the screen.) Check the results with those obtained in Section 2.3.
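The same five coefficients follow from the closed forms for a causal, invertible ARMA(1,1), $\psi_j = (\phi+\theta)\phi^{j-1}$ and $\pi_j = -(\phi+\theta)(-\theta)^{j-1}$ for $j \ge 1$ (with the convention $\psi_0 = \pi_0 = 1$, as in Section 2.3). A Python check:

```python
# Closed-form ARMA(1,1) expansion coefficients, j = 1, ..., 5:
#   psi_j = (phi + theta) * phi^(j-1)
#   pi_j  = -(phi + theta) * (-theta)^(j-1)
phi, theta = 0.5, 0.5
psi = [(phi + theta) * phi ** (j - 1) for j in range(1, 6)]
pi = [-(phi + theta) * (-theta) ** (j - 1) for j in range(1, 6)]
print(psi)   # [1.0, 0.5, 0.25, 0.125, 0.0625]
print(pi)    # [-1.0, 0.5, -0.25, 0.125, -0.0625]
```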

2.11 Suppose that in a sample of size 100 from an AR(1) process with mean $\mu$, $\phi = 0.6$, and $\sigma^2 = 2$ we obtain $\bar x_{100} = 0.271$. Construct an approximate 95% confidence interval for $\mu$. Are the data compatible with the hypothesis that $\mu = 0$?

2.12 Suppose that in a sample of size 100 from an MA(1) process with mean $\mu$, $\theta = -0.6$, and $\sigma^2 = 1$ we obtain $\bar x_{100} = 0.157$. Construct an approximate 95% confidence interval for $\mu$. Are the data compatible with the hypothesis that $\mu = 0$?
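For both of the last two problems the approximate interval is $\bar x_n \pm 1.96\sqrt{v/n}$, where $v = \sum_{h=-\infty}^{\infty}\gamma(h)$, so $v = \sigma^2/(1-\phi)^2$ for the AR(1) and $v = \sigma^2(1+\theta)^2$ for the MA(1). A Python sketch of the arithmetic (the numbers are worked from the problem statements; treat it as a check, not the intended solution write-up):

```python
from math import sqrt

def mean_ci(xbar, v, n, z=1.96):
    """Approximate 95% CI for mu: xbar +/- z * sqrt(v / n), v = sum of the ACVF."""
    half = z * sqrt(v / n)
    return (xbar - half, xbar + half)

# Problem 2.11, AR(1): v = sigma^2 / (1 - phi)^2 = 2 / (1 - 0.6)^2 = 12.5
print(mean_ci(0.271, 2.0 / (1 - 0.6) ** 2, 100))    # interval contains 0
# Problem 2.12, MA(1): v = sigma^2 * (1 + theta)^2 = 1 * (1 - 0.6)^2 = 0.16
print(mean_ci(0.157, 1.0 * (1 + (-0.6)) ** 2, 100)) # interval excludes 0
```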

2.13 Suppose that in a sample of size 100, we obtain $\hat\rho(1) = 0.438$ and $\hat\rho(2) = 0.145$.

a. Assuming that the data were generated from an AR(1) model, construct approximate 95% confidence intervals for both $\rho(1)$ and $\rho(2)$. Based on these two confidence intervals, are the data consistent with an AR(1) model with $\phi = 0.8$?

b. Assuming that the data were generated from an MA(1) model, construct approximate 95% confidence intervals for both $\rho(1)$ and $\rho(2)$. Based on these two confidence intervals, are the data consistent with an MA(1) model with $\theta = 0.6$?

2.14 Let $\{X_t\}$ be the process defined in Problem 2.2.

a. Find $P_1X_2$ and its mean squared error.

b. Find $P_2X_3$ and its mean squared error.

c. Find $\tilde P_nX_{n+1}$ and its mean squared error.

2.15 Suppose that $\{X_t, t = 0, \pm 1, \ldots\}$ is a stationary process satisfying the equations

$$X_t = \phi_1 X_{t-1} + \cdots + \phi_p X_{t-p} + Z_t,$$

where $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$ and $Z_t$ is uncorrelated with $X_s$ for each $s < t$. Show that the best linear predictor $P_nX_{n+1}$ of $X_{n+1}$ in terms of $1, X_1, \ldots, X_n$, assuming $n > p$, is

$$P_nX_{n+1} = \phi_1 X_n + \cdots + \phi_p X_{n+1-p}.$$

What is the mean squared error of $P_nX_{n+1}$?

2.16 Use the program ITSM to plot the sample ACF and PACF up to lag 40 of the sunspot series $D_t$, $t = 1, \ldots, 100$, contained in the ITSM file SUNSPOTS.TSM. (Open the project SUNSPOTS.TSM and click on the second yellow button at the top of the screen to see the graphs. Repeated clicking on this button will toggle between graphs of the sample ACF, sample PACF, and both. To see the numerical values, right-click on the graph and select Info.) Fit an AR(2) model to the mean-corrected data by selecting Model>Estimation>Preliminary and click Yes to subtract the sample mean from the data. In the dialog box that follows, enter 2 for the AR order and make sure that the MA order is zero and that the Yule-Walker algorithm is selected without AICC minimization. Click OK and you will obtain a model of the form

$$X_t = \hat\phi_1 X_{t-1} + \hat\phi_2 X_{t-2} + Z_t, \qquad \{Z_t\} \sim \mathrm{WN}(0, \hat\sigma^2),$$

for the mean-corrected series $X_t = D_t - 46.93$. Record the values of the estimated parameters $\hat\phi_1$, $\hat\phi_2$, and $\hat\sigma^2$. Compare the model and sample ACF and PACF by selecting the third yellow button at the top of the screen. Print the graphs by right-clicking and selecting Print.
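ITSM's Yule-Walker preliminary estimation can be mirrored in a few lines. The Python sketch below (numpy assumed; synthetic AR(2) data with coefficients 1.3 and $-0.6$ stand in for SUNSPOTS.TSM, which is not reproduced here) solves the order-2 Yule-Walker equations $\Gamma_2\hat{\boldsymbol\phi} = (\hat\gamma(1), \hat\gamma(2))'$ built from the sample ACVF of the mean-corrected data:

```python
import numpy as np

def yule_walker_ar2(x):
    """Yule-Walker estimates (phi1, phi2, sigma2) for an AR(2) fitted to data x."""
    x = np.asarray(x, dtype=float) - np.mean(x)          # mean-correct
    n = len(x)
    gamma = [x[: n - h] @ x[h:] / n for h in range(3)]   # sample ACVF at lags 0, 1, 2
    G = np.array([[gamma[0], gamma[1]], [gamma[1], gamma[0]]])
    g1 = np.array(gamma[1:])
    phi = np.linalg.solve(G, g1)                         # Gamma_2 * phi = (gamma(1), gamma(2))'
    sigma2 = gamma[0] - phi @ g1                         # white-noise variance estimate
    return phi, sigma2

# Synthetic AR(2) sample in place of the sunspot data (illustrative only):
rng = np.random.default_rng(4)
x = np.zeros(2000)
for t in range(2, 2000):
    x[t] = 1.3 * x[t - 1] - 0.6 * x[t - 2] + rng.normal()

phi_hat, sig2_hat = yule_walker_ar2(x)
print(phi_hat, sig2_hat)   # near (1.3, -0.6) and 1
```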

2.17 Without exiting from ITSM, use the model found in the preceding problem to compute forecasts of the next ten values of the sunspot series. (Select Forecasting>ARMA, make sure that the number of forecasts is set to 10 and the box Add the mean to the forecasts is checked, and then click OK. You will see a graph of the original data with the ten forecasts appended. Right-click on the graph and then on Info to get the numerical values of the forecasts. Print the graph as described in Problem 2.16.) The details of the calculations will be taken up in Chapter 3 when we discuss ARMA models in detail.

2.18 Let $\{X_t\}$ be the stationary process defined by the equations

$$X_t = Z_t - \theta Z_{t-1}, \qquad t = 0, \pm 1, \ldots,$$

where $|\theta| < 1$ and $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$. Show that the best linear predictor $\tilde P_nX_{n+1}$ of $X_{n+1}$ based on $\{X_j, -\infty < j \le n\}$ is

$$\tilde P_nX_{n+1} = -\sum_{j=1}^{\infty}\theta^j X_{n+1-j}.$$

What is the mean squared error of the predictor $\tilde P_nX_{n+1}$?

2.19 If $\{X_t\}$ is defined as in Problem 2.18 and $\theta = 1$, find the best linear predictor $P_nX_{n+1}$ of $X_{n+1}$ in terms of $X_1, \ldots, X_n$. What is the corresponding mean squared error?

2.20 In the innovations algorithm, show that for each $n \ge 2$, the innovation $X_n - \hat X_n$ is uncorrelated with $X_1, \ldots, X_{n-1}$. Conclude that $X_n - \hat X_n$ is uncorrelated with the innovations $X_1 - \hat X_1, \ldots, X_{n-1} - \hat X_{n-1}$.

2.21 Let $X_1, X_2, X_4, X_5$ be observations from the MA(1) model

$$X_t = Z_t + \theta Z_{t-1}, \qquad \{Z_t\} \sim \mathrm{WN}(0, \sigma^2).$$

a. Find the best linear estimate of the missing value $X_3$ in terms of $X_1$ and $X_2$.

b. Find the best linear estimate of the missing value $X_3$ in terms of $X_4$ and $X_5$.

c. Find the best linear estimate of the missing value $X_3$ in terms of $X_1$, $X_2$, $X_4$, and $X_5$.

d. Compute the mean squared errors for each of the estimates in (a)-(c).
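Numerically, part (c) amounts to solving the prediction equations $\Gamma \mathbf{a} = \boldsymbol\gamma$, where $\Gamma$ collects the covariances among the observed values and $\boldsymbol\gamma$ their covariances with the missing $X_3$. A Python sketch (numpy assumed; the value $\theta = 0.5$ and the sign convention $X_t = Z_t + \theta Z_{t-1}$ are illustrative assumptions, and the exercise itself should still be worked symbolically):

```python
import numpy as np

def interpolate_missing(theta, sigma2=1.0, obs=(1, 2, 4, 5)):
    """Best linear estimate of X_3 from the observed times in `obs`, for an
    MA(1) with gamma(0) = sigma2*(1 + theta^2), gamma(1) = sigma2*theta."""
    def gamma(h):
        h = abs(h)
        if h == 0:
            return sigma2 * (1 + theta ** 2)
        return sigma2 * theta if h == 1 else 0.0

    G = np.array([[gamma(s - t) for t in obs] for s in obs])   # Cov among observations
    g = np.array([gamma(3 - t) for t in obs])                  # Cov with the missing X_3
    a = np.linalg.solve(G, g)                                  # prediction-equation solution
    mse = gamma(0) - g @ a                                     # E(X_3 - estimate)^2
    return a, mse

a, mse = interpolate_missing(0.5)        # part (c); obs=(1, 2) or (4, 5) gives (a), (b)
print(np.round(a, 4), round(mse, 4))
```

By the symmetry of the observation times about $t = 3$, the coefficients of $X_1, X_5$ coincide, as do those of $X_2, X_4$.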

2.22 Repeat parts (a)-(d) of Problem 2.21 assuming now that the observations $X_1, X_2, X_4, X_5$ are from the causal AR(1) model

$$X_t = \phi X_{t-1} + Z_t, \qquad \{Z_t\} \sim \mathrm{WN}(0, \sigma^2).$$

3 ARMA Models