5.5 Problems and Solutions
Fig. 5.6 Nonstationary fractional noise for d = 1.45, 1.0, 0.55 (one panel per value of d)
or whether a process has short memory or not,

\[
H_0\colon d \le 0 \quad \text{vs.} \quad H_1\colon d > 0\,.
\]

Demetrescu, Kuzin, and Hassler (2008) suggested a simple procedure, similar to the Dickey-Fuller test, to test for arbitrary values $d_0$.
Hint: Use properties of the Gamma function

\[
\Gamma(x) = \begin{cases}
\int_0^\infty t^{x-1} e^{-t}\,dt, & x > 0, \\
\infty, & x = 0, \\
\Gamma(x+1)/x, & x < 0,\ x \neq -1, -2, \ldots
\end{cases} \tag{5.18}
\]
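As a quick plausibility check (an addition, not part of the original text), the following Python sketch uses scipy.special.gamma to confirm that the recursion in (5.18) agrees with SciPy's continuation of the Gamma function at negative non-integer arguments.

```python
# Numerical sketch (not from the book): checking the pieces of (5.18) with SciPy.
import numpy as np
from scipy.special import gamma

# For x > 0, scipy.special.gamma evaluates the integral definition.
assert np.isclose(gamma(0.5), np.sqrt(np.pi))

# For x < 0 (x not a negative integer), the recursion Gamma(x) = Gamma(x+1)/x
# extends the function; SciPy's continuation agrees with it.
for x in (-0.3, -1.7, -2.5):
    assert np.isclose(gamma(x), gamma(x + 1) / x)
print("Gamma recursion of (5.18) confirmed at the sample points")
```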
5.4 Establish the following expression for the autocovariance $\gamma(h)$ of fractionally integrated noise:

\[
\gamma(h) = \sigma^2\, \frac{\Gamma(1-2d)\,\Gamma(d+h)}{\Gamma(d)\,\Gamma(1-d)\,\Gamma(1-d+h)}\,. \tag{5.19}
\]
Hint: Use Proposition 4.1 with (5.11), and apply the following identity from Gradshteyn and Ryzhik (2000, 3.631.8):

\[
\int_0^\pi \sin^{\nu-1}(x)\cos(ax)\,dx
= \frac{\pi \cos(a\pi/2)\,\Gamma(\nu+1)}{2^{\nu-1}\,\nu\,\Gamma\!\left(\frac{\nu+a+1}{2}\right)\Gamma\!\left(\frac{\nu-a+1}{2}\right)}\,,
\quad \text{where } \nu > 0\,.
\]

5.5 Show Proposition 5.1(a).
Hint: Use (5.19).
5.6 Show Proposition 5.1(b).

Hint: Use (5.19).

5.7 Show Proposition 5.1(c).
5.8 Consider the ARFIMA(0,d,1) model $\Delta^d x_t = B(L)\,\varepsilon_t$, $B(L) = 1 + b\,L$. Show for this special case that the proportionality constant from Proposition 5.3(b) is indeed $\gamma_d\, f_e(0)\, 2\pi$.
Solutions
5.1 We define the $p$-series of the first $J$ terms,

\[
S_J(p) = \sum_{j=1}^{J} \frac{1}{j^p}\,.
\]
First, we discuss the case separating the convergence region from the divergent one ($p = 1$). For convenience choose $J = 2^n$, such that
\[
S_{2^n}(1) = 1 + \frac{1}{2} + \left(\frac{1}{3} + \frac{1}{4}\right) + \cdots + \left(\frac{1}{2^{n-1}+1} + \cdots + \frac{1}{2^n}\right)
> \frac{1}{2} + \frac{2}{4} + \frac{4}{8} + \cdots + \frac{2^{n-1}}{2^n} = \frac{n}{2}\,.
\]
Since the lower bound of $S_{2^n}(1)$ diverges to infinity with $n$, $S_J(1)$ must diverge, too.

Second, consider the case where $p < 1$, such that $j^p < j$, or $j^{-p} > j^{-1}$. Hence, $S_J(p) > S_J(1)$, and divergence of $S_J(1)$ implies divergence for $p < 1$.
Third, for $p > 1$, we group the terms for $J = 2^n - 1$ as follows:

\[
S_{2^n-1}(p) = 1 + \left(\frac{1}{2^p} + \frac{1}{3^p}\right) + \cdots + \left(\frac{1}{(2^{n-1})^p} + \cdots + \frac{1}{(2^n-1)^p}\right)
< 1 + \frac{2}{2^p} + \frac{4}{4^p} + \cdots + \frac{2^{n-1}}{(2^{n-1})^p}
= 1 + \frac{1}{2^{p-1}} + \frac{1}{4^{p-1}} + \cdots + \frac{1}{(2^{n-1})^{p-1}}\,.
\]
We now abbreviate $g = \frac{1}{2^{p-1}}$ with $0 < g < 1$ since $p > 1$. Consequently,

\[
S_{2^n-1}(p) < 1 + g + g^2 + \cdots + g^{n-1} = \frac{1-g^n}{1-g}
< \sum_{i=0}^{\infty} g^i = \frac{1}{1-g} = \frac{2^{p-1}}{2^{p-1}-1}\,,
\]

where we use the geometric series, see Problem 3.2. Hence, $S_{2^n-1}(p)$ is bounded for every $n$, while growing monotonically at the same time, which establishes convergence for $p > 1$. Hence, the proof of (5.1) is complete.
We want to add a final remark. While convergence is ensured for $p > 1$, an explicit expression for the limit is by no means obvious, and indeed only known for selected values. For example, for $p = 2$ one has the famous result by Leonhard Euler:

\[
\lim_{J \to \infty} S_J(2) = \sum_{j=1}^{\infty} \frac{1}{j^2} = \frac{\pi^2}{6}\,.
\]
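The convergence behaviour is easy to inspect numerically. The following Python sketch (an addition, not from the book) evaluates the partial sums $S_J(p)$ for $p = 2$ and $p = 1$.

```python
# Numerical sketch (not from the book): behaviour of the partial sums S_J(p).
import numpy as np

def S(J, p):
    j = np.arange(1, J + 1, dtype=float)
    return np.sum(1.0 / j**p)

# p = 2: the partial sums approach Euler's limit pi^2/6.
print(S(10**6, 2), np.pi**2 / 6)        # ~1.6449331 vs 1.6449341

# p = 1: the partial sums keep growing (roughly like log J), i.e. divergence.
for J in (10**2, 10**4, 10**6):
    print(J, S(J, 1))                   # about 5.19, 9.79, 14.39
```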
5.2 Since the limit of the ratio of interest is indeterminate, we apply L'Hospital's rule:

\[
\lim_{j \to \infty} \frac{g^j}{j^{d-1}}
= \lim_{j \to \infty} \frac{j^{1-d}}{g^{-j}}
= \lim_{j \to \infty} \frac{(1-d)\,j^{-d}}{g^{-j}\log(g)\,(-1)}
= \frac{d-1}{\log(g)}\, \lim_{j \to \infty} \frac{g^j}{j^{d}} = 0\,, \quad d \ge 0\,.
\]
If $-1 < d < 0$, the expression $g^j/j^d$ is still indeterminate. Applying L'Hospital's rule a second time, however, yields:

\[
\lim_{j \to \infty} \frac{g^j}{j^{d}}
= \lim_{j \to \infty} \frac{j^{-d}}{g^{-j}}
= \lim_{j \to \infty} \frac{-d\,j^{-d-1}}{g^{-j}\log(g)\,(-1)}
= \frac{d}{\log(g)}\, \lim_{j \to \infty} \frac{g^j}{j^{1+d}} = 0\,.
\]
This establishes the claim.
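A short numerical illustration (an addition, not from the book) of the limit just derived, for a value $0 < g < 1$ and several $d > -1$:

```python
# Numerical sketch (not from the book): g^j / j^(d-1) -> 0 for 0 < g < 1, d > -1.
import numpy as np

g = 0.9
j = np.array([10.0, 100.0, 1000.0])
for d in (-0.4, 0.0, 0.3, 1.5):
    print(d, g**j / j**(d - 1.0))   # each row shrinks towards zero as j grows
```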
5.3 Prior to solving the problem, we review some useful properties of the Gamma function, which is often employed to simplify manipulations with binomial expressions; see e.g. Sydsæter, Strøm, and Berck (1999, p. 52), and in much greater detail Gradshteyn and Ryzhik (2000, Sect. 8.31), or Rudin (1976, Ch. 8) containing proofs. For integer numbers, $\Gamma$ coincides with the factorial,

\[
\Gamma(n+1) = n(n-1)\cdots 2 = n!\,,
\]

which implies a recursive relation that in fact holds in general:

\[
\Gamma(x+1) = x\,\Gamma(x)\,. \tag{5.20}
\]
Hence, obviously $\Gamma(1) = \Gamma(2) = 1$, and a further value often encountered is $\Gamma(0.5) = \sqrt{\pi}$. The recursive relation further yields the rate of divergence at the origin,

\[
\Gamma(x) \sim x^{-1}\,, \quad x \to 0\,,
\]

which justifies the convention $\Gamma(0)/\Gamma(0) = 1$. Finally, we want to approximate the Gamma function for large arguments. Remember Stirling's formula for factorials,
\[
n! \sim \sqrt{2\pi n}\, \left(\frac{n}{e}\right)^{n}.
\]

It generalizes to

\[
\Gamma(x) \sim \sqrt{\frac{2\pi}{x}}\, \left(\frac{x}{e}\right)^{x}
\]

for $x \to \infty$, which again has to be read as

\[
\lim_{x \to \infty} \Gamma(x) \Big/ \sqrt{\frac{2\pi}{x}}\, \left(\frac{x}{e}\right)^{x} = 1\,.
\]
Consequently, using $(1+x/n)^n \to e^x$, we have for finite $x$ and $y$ and large integer $n$ that

\[
\frac{\Gamma(n+x)}{\Gamma(n+y)} \sim n^{x-y}\,, \quad n \to \infty\,. \tag{5.21}
\]
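The asymptotic ratio (5.21) can be checked numerically; the sketch below (an addition, not from the book) uses scipy.special.gammaln to avoid overflow for large $n$.

```python
# Numerical sketch (not from the book): Gamma(n+x)/Gamma(n+y) ~ n^(x-y), cf. (5.21).
import numpy as np
from scipy.special import gammaln   # log-Gamma avoids overflow for large n

x, y = 0.3, -0.2
for n in (10, 100, 1000, 10000):
    ratio = np.exp(gammaln(n + x) - gammaln(n + y))
    print(n, ratio / n**(x - y))    # tends to 1 as n grows
```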
Now, we turn to establishing (5.4). Repeated application of (5.20) gives

\[
\frac{\Gamma(j-d)}{\Gamma(-d)} = (j-d-1)(j-d-2)\cdots(-d)\,.
\]

By definition of the binomial coefficients we conclude from (5.3) with $\Gamma(j+1) = j!$ that

\[
\pi_j = \frac{\Gamma(j-d)}{\Gamma(j+1)\,\Gamma(-d)}\,, \quad j \ge 0\,,
\qquad \pi_j \sim \frac{j^{-d-1}}{\Gamma(-d)}\,, \quad j \to \infty\,,
\]

where the approximation relies on (5.21). This is the required result.
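The following Python sketch is an addition, not from the book; the name pi_j merely mirrors the coefficient notation reconstructed above. It generates the expansion coefficients of $\Delta^d$ via the recursion implied by the Gamma representation just derived, $\pi_j = \pi_{j-1}(j-1-d)/j$ with $\pi_0 = 1$, and compares them with the asymptote $j^{-d-1}/\Gamma(-d)$.

```python
# Numerical sketch (not from the book): expansion coefficients of Delta^d = (1-L)^d
# from the recursion pi_j = pi_{j-1} (j-1-d)/j, against the asymptote j^(-d-1)/Gamma(-d).
import numpy as np
from scipy.special import gamma

d = 0.3
J = 5000
pi_j = np.empty(J + 1)
pi_j[0] = 1.0
for j in range(1, J + 1):
    pi_j[j] = pi_j[j - 1] * (j - 1 - d) / j

for j in (10, 100, 1000, 5000):
    print(j, pi_j[j] * gamma(-d) * j**(d + 1.0))   # ratio tends to 1
```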
5.4 With Proposition 4.1 and (5.11) we compute

\[
\gamma(h) = 2 \int_0^\pi f(\lambda)\cos(\lambda h)\,d\lambda
= \frac{4^{-d}\, 2\sigma^2}{\pi} \int_0^{\pi/2} \sin^{-2d}(x)\cos(2hx)\,dx\,,
\]

where we substituted $\lambda/2 = x$. Both $\sin^{2}(x)$ and $\cos(2hx)$ are symmetric around $\pi/2$, such that

\[
\int_0^{\pi} \sin^{-2d}(x)\cos(2hx)\,dx = 2 \int_0^{\pi/2} \sin^{-2d}(x)\cos(2hx)\,dx\,.
\]
Hence, the integration formula 3.631.8 from Gradshteyn and Ryzhik (2000) can be applied with $\nu = 1-2d > 0$ (as long as $d < 0.5$) and $a = 2h$:

\[
\begin{aligned}
\gamma(h) &= \frac{4^{-d}\sigma^2}{\pi} \int_0^{\pi} \sin^{-2d}(x)\cos(2hx)\,dx \\
&= \frac{4^{-d}\sigma^2}{\pi}\,
\frac{\pi\cos(h\pi)\,\Gamma(2-2d)}{2^{-2d}\,(1-2d)\,\Gamma(1-d+h)\,\Gamma(1-d-h)} \\
&= \sigma^2\, \frac{(-1)^h\,\Gamma(2-2d)}{(1-2d)\,\Gamma(1-d+h)\,\Gamma(1-d-h)}
= \sigma^2\, \frac{(-1)^h\,\Gamma(1-2d)}{\Gamma(1-d+h)\,\Gamma(1-d-h)}\,.
\end{aligned}
\]

Using (5.20) once more, one can show that

\[
\frac{\Gamma(d+h)}{\Gamma(d)} = (-1)^h\, \frac{\Gamma(1-d)}{\Gamma(1-d-h)}\,.
\]

Therefore, we finally have

\[
\gamma(h) = \sigma^2\, \frac{\Gamma(1-2d)\,\Gamma(d+h)}{\Gamma(d)\,\Gamma(1-d)\,\Gamma(1-d+h)}\,,
\]

which is (5.19).
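Formula (5.19) can also be verified numerically against the integral of Proposition 4.1. The sketch below (an addition, not from the book) assumes the fractional-noise spectral density $f(\lambda) = \sigma^2 (2\sin(\lambda/2))^{-2d}/(2\pi)$ used in the derivation above and integrates it with scipy.integrate.quad.

```python
# Numerical sketch (not from the book): (5.19) versus numerical integration of
# gamma(h) = 2 * int_0^pi f(lambda) cos(lambda h) dlambda, assuming the fractional-noise
# spectrum f(lambda) = sigma^2/(2 pi) * (2 sin(lambda/2))^(-2d) as in the derivation above.
import numpy as np
from scipy.special import gamma
from scipy.integrate import quad

d, sigma2 = 0.3, 1.0

def f(lam):
    return sigma2 / (2 * np.pi) * (2 * np.sin(lam / 2))**(-2 * d)

def acov(h):   # closed form (5.19)
    return sigma2 * gamma(1 - 2*d) * gamma(d + h) / (gamma(d) * gamma(1 - d) * gamma(1 - d + h))

for h in (0, 1, 5, 20):
    num, _ = quad(lambda lam: 2 * f(lam) * np.cos(lam * h), 0, np.pi, limit=200)
    print(h, num, acov(h))   # the two columns agree to several digits
```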
5.5 With (5.19) we obtain

\[
\gamma(0; d) = \frac{\Gamma(1-2d)}{(\Gamma(1-d))^2}\,,
\]

where we assumed $\sigma^2 = 1$ without loss of generality. Instead of the variance, we will equivalently minimize its natural logarithm. We compute the derivative

\[
\frac{\partial \log(\gamma(0;d))}{\partial d}
= -2 \left( \frac{\Gamma'(1-2d)}{\Gamma(1-2d)} - \frac{\Gamma'(1-d)}{\Gamma(1-d)} \right)
= -2 \left( \psi(1-2d) - \psi(1-d) \right),
\]
where the so-called psi function is defined as the logarithmic derivative of the Gamma function,

\[
\psi(x) = \frac{\partial \log(\Gamma(x))}{\partial x}\,.
\]

The psi function is strictly increasing for $x > 0$, which can be shown in different ways, see e.g. Gradshteyn and Ryzhik (2000, 8.362.1). Consequently,

\[
\psi(1-d) - \psi(1-2d)
\begin{cases}
> 0\,, & 0 < d < 0.5\,, \\
= 0\,, & d = 0\,, \\
< 0\,, & d < 0\,,
\end{cases}
\]

which proves that $\log(\gamma(0;d))$, and hence $\gamma(0;d)$, takes on its minimum for $d = 0$.
This solves the problem.
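For illustration (an addition, not from the book), one can tabulate $\gamma(0; d) = \Gamma(1-2d)/\Gamma(1-d)^2$ on a grid and confirm the minimum at $d = 0$:

```python
# Numerical sketch (not from the book): the variance gamma(0; d) = Gamma(1-2d)/Gamma(1-d)^2
# (with sigma^2 = 1) has its minimum at d = 0.
import numpy as np
from scipy.special import gamma

d_grid = np.linspace(-1.0, 0.49, 2000)
var = gamma(1 - 2 * d_grid) / gamma(1 - d_grid)**2
i = np.argmin(var)
print("minimizing d:", d_grid[i])    # approximately 0
print("minimal variance:", var[i])   # approximately 1
```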
5.6 The recursive relation for $\gamma(h)$ is obvious with (5.19) and (5.20) at hand. For $h \to \infty$, the approximation in (5.21) yields

\[
\gamma(h) \sim \sigma^2\, \frac{\Gamma(1-2d)}{\Gamma(d)\,\Gamma(1-d)}\, h^{2d-1}\,,
\]

which defines the constant from Proposition 5.1(b):

\[
\gamma_d = \frac{\Gamma(1-2d)}{\Gamma(d)\,\Gamma(1-d)}\,.
\]

The Gamma function is positive for positive arguments and negative on the interval $(-1, 0)$. Hence, the sign of $\gamma_d$ equals the sign of $\Gamma(d)$ since $d < 0.5$, which completes the proof of Proposition 5.1(b).
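The hyperbolic decay and the constant $\gamma_d$ (notation as reconstructed above) can be illustrated numerically; the sketch below is an addition, not from the book.

```python
# Numerical sketch (not from the book): hyperbolic decay gamma(h) ~ gamma_d sigma^2 h^(2d-1),
# with gamma_d = Gamma(1-2d)/(Gamma(d) Gamma(1-d)).
import numpy as np
from scipy.special import gamma, gammaln

d, sigma2 = 0.3, 1.0
gamma_d = gamma(1 - 2*d) / (gamma(d) * gamma(1 - d))

def acov(h):   # (5.19), evaluated via log-Gamma for numerical stability
    return sigma2 * np.exp(gammaln(1 - 2*d) + gammaln(d + h)
                           - gammaln(d) - gammaln(1 - d) - gammaln(1 - d + h))

for h in (10, 100, 1000, 10000):
    print(h, acov(h) / (sigma2 * h**(2*d - 1)), gamma_d)   # first column tends to gamma_d
```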
5.7 In terms of autocorrelations the recursion from Proposition 5.1(b) becomes

\[
\rho(h; d) = f(h; d)\, \rho(h-1; d)\,, \quad h \ge 1\,,
\]

where the factor $f(h; d)$ is positive for $d > 0$,

\[
f(h; d) = \frac{h-1+d}{h-d} > 0 \quad \text{with} \quad \frac{\partial f(h; d)}{\partial d} > 0\,,
\]

such that $\rho(h; d) > 0$ since $\rho(0; d) = 1$. Hence, we have

\[
\frac{\partial \rho(h; d)}{\partial d}
= \frac{\partial f(h; d)}{\partial d}\, \rho(h-1; d) + f(h; d)\, \frac{\partial \rho(h-1; d)}{\partial d}\,,
\]

which is positive by induction since

\[
\frac{\partial \rho(1; d)}{\partial d} = \frac{\partial f(1; d)}{\partial d} > 0\,.
\]

Hence, $\rho(h; d)$ is growing with $d$ as required.
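A numerical check (an addition, not from the book) of the monotonicity in $d$, building $\rho(h; d)$ from the recursion just used:

```python
# Numerical sketch (not from the book): rho(h; d) from the recursion
# rho(h; d) = f(h; d) rho(h-1; d), f(h; d) = (h-1+d)/(h-d), grows with d for fixed h.
import numpy as np

def rho(h, d):
    r = 1.0
    for k in range(1, h + 1):
        r *= (k - 1 + d) / (k - d)
    return r

for h in (1, 5, 20):
    vals = [rho(h, d) for d in (0.1, 0.2, 0.3, 0.4)]
    print(h, np.round(vals, 4), "increasing:", bool(np.all(np.diff(vals) > 0)))
```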
5.8 The spectral density of the MA(1) component $e_t = \varepsilon_t + b\,\varepsilon_{t-1}$ is known from Example 4.3:

\[
f_e(0) = \frac{(1+b)^2\,\sigma^2}{2\pi}\,.
\]

We now express $x_t$ in terms of a fractional noise called $y_t = \Delta^{-d}\varepsilon_t$:

\[
x_t = \Delta^{-d} B(L)\,\varepsilon_t = B(L)\,\Delta^{-d}\varepsilon_t = (1 + bL)\, y_t\,.
\]

Let $\gamma_x(h)$ and $\gamma_y(h)$ denote the autocovariances of $\{x_t\}$ and $\{y_t\}$, respectively. It holds that

\[
\gamma_x(h) = \mathrm{E}\!\left[(y_t + b\, y_{t-1})(y_{t+h} + b\, y_{t+h-1})\right]
= (1+b^2)\,\gamma_y(h) + b\left(\gamma_y(h-1) + \gamma_y(h+1)\right).
\]

With the behavior of $\gamma_y(h)$ from Proposition 5.1 it follows that

\[
\frac{\gamma_x(h)}{h^{2d-1}} \to (1+b^2)\,\gamma_d\,\sigma^2 + 2b\,\gamma_d\,\sigma^2 = \gamma_d\, f_e(0)\, 2\pi\,,
\]

as required.
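Finally, a numerical check (an addition, not from the book) of the proportionality constant derived in Problem 5.8, using (5.19) for the fractional noise $y_t$ and the MA(1) relation above:

```python
# Numerical sketch (not from the book): for x_t = (1 + bL) y_t with fractional noise y_t,
# gamma_x(h)/h^(2d-1) approaches gamma_d * 2*pi*f_e(0), f_e(0) = (1+b)^2 sigma^2/(2 pi).
import numpy as np
from scipy.special import gamma, gammaln

d, b, sigma2 = 0.3, 0.5, 1.0
gamma_d = gamma(1 - 2*d) / (gamma(d) * gamma(1 - d))
fe0 = (1 + b)**2 * sigma2 / (2 * np.pi)

def acov_y(h):   # (5.19) for the fractional noise y_t
    return sigma2 * np.exp(gammaln(1 - 2*d) + gammaln(d + h)
                           - gammaln(d) - gammaln(1 - d) - gammaln(1 - d + h))

def acov_x(h):   # gamma_x(h) = (1+b^2) gamma_y(h) + b (gamma_y(h-1) + gamma_y(h+1))
    return (1 + b**2) * acov_y(h) + b * (acov_y(h - 1) + acov_y(h + 1))

for h in (10, 100, 1000):
    print(h, acov_x(h) / h**(2*d - 1), gamma_d * 2 * np.pi * fe0)   # columns converge
```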
References
Baillie, R. T. (1996). Long memory processes and fractional integration in econometrics. Journal of Econometrics, 73, 5–59.
Bondon, P., & Palma, W. (2007). A class of antipersistent processes. Journal of Time Series Analysis, 28, 261–273.
Brockwell, P. J., & Davis, R. A. (1991). Time series: Theory and methods (2nd ed.). New York: Springer.
Demetrescu, M., Kuzin, V., & Hassler, U. (2008). Long memory testing in the time domain. Econometric Theory, 24, 176–215.
Dickey, D. A., & Fuller, W. A. (1979). Distribution of the estimators for autoregressive time series with a unit root. Journal of the American Statistical Association, 74, 427–431.
Giraitis, L., Koul, H. L., & Surgailis, D. (2012). Large sample inference for long memory processes. London: Imperial College Press.
Gradshteyn, I. S., & Ryzhik, I. M. (2000). Table of integrals, series, and products (6th ed.). London/San Diego: Academic Press.
Granger, C. W. J. (1981). Some properties of time series data and their use in econometric model specification. Journal of Econometrics, 16, 121–130.
Granger, C. W. J., & Joyeux, R. (1980). An introduction to long-memory time series models and fractional differencing. Journal of Time Series Analysis, 1, 15–29.
Hassler, U. (2012). Impulse responses of antipersistent processes. Economics Letters, 116, 454–456.
Hassler, U. (2014). Persistence under temporal aggregation and differencing. Economics Letters, 124, 318–322.
Hassler, U., & Kokoszka, P. (2010). Impulse responses of fractionally integrated processes with long memory. Econometric Theory, 26, 1855–1861.
Hosking, J. R. M. (1981). Fractional differencing. Biometrika, 68, 165–176.
Maasoumi, E., & McAleer, M. (2008). Realized volatility and long memory: An overview. Econometric Reviews, 27, 1–9.
Rudin, W. (1976). Principles of mathematical analysis (3rd ed.). New York: McGraw-Hill.
Sydsæter, K., Strøm, A., & Berck, P. (1999). Economists' mathematical manual (3rd ed.). Berlin/New York: Springer.
Trench, W. F. (2013). Introduction to real analysis. Free Hyperlinked Edition 2.04, December 2013. Downloaded on 10th May 2014 from http://digitalcommons.trinity.edu/mono/7.