# Complete, Detailed Solutions to the Exercises of Time Series Analysis with Applications in R (2nd Edition)


Exercise 2.8 Suppose that $\{Y_t\}$ is stationary with autocovariance function $\gamma_k$. Show that for any fixed positive integer $n$ and any constants $c_1, c_2, \dots, c_n$, the process $\{W_t\}$ defined by $W_t = c_1 Y_t + c_2 Y_{t-1} + \cdots + c_n Y_{t-n+1}$ is stationary.

First, $E(W_t) = c_1 E(Y_t) + c_2 E(Y_{t-1}) + \cdots + c_n E(Y_{t-n+1}) = (c_1 + c_2 + \cdots + c_n)\mu_Y$, free of $t$. Also,
$$\mathrm{Cov}(W_t, W_{t-k}) = \mathrm{Cov}\Big(\sum_{i=0}^{n-1} c_{i+1} Y_{t-i},\ \sum_{j=0}^{n-1} c_{j+1} Y_{t-k-j}\Big) = \sum_{j=0}^{n-1}\sum_{i=0}^{n-1} c_{i+1} c_{j+1}\,\mathrm{Cov}(Y_{t-i}, Y_{t-k-j}) = \sum_{j=0}^{n-1}\sum_{i=0}^{n-1} c_{i+1} c_{j+1}\,\gamma_{k+j-i},$$
free of $t$.

Exercise 2.9 Suppose $Y_t = \beta_0 + \beta_1 t + X_t$, where $\{X_t\}$ is a zero-mean stationary series with autocovariance function $\gamma_k$ and $\beta_0$ and $\beta_1$ are constants.

(a) Show that $\{Y_t\}$ is not stationary but that $W_t = \nabla Y_t = Y_t - Y_{t-1}$ is stationary. $\{Y_t\}$ is not stationary since its mean, $\beta_0 + \beta_1 t$, varies with $t$. However, $E(W_t) = E(Y_t - Y_{t-1}) = (\beta_0 + \beta_1 t) - (\beta_0 + \beta_1(t-1)) = \beta_1$, free of $t$. The argument in the solution of Exercise 2.7 shows that the covariance function for $\{W_t\}$ is free of $t$.

(b) In general, show that if $Y_t = \mu_t + X_t$, where $\{X_t\}$ is a zero-mean stationary series and $\mu_t$ is a polynomial in $t$ of degree $d$, then $\nabla^m Y_t = \nabla(\nabla^{m-1} Y_t)$ is stationary for $m \ge d$ and nonstationary for $m < d$. Use part (a) and proceed by induction.

Exercise 2.10 Let $\{X_t\}$ be a zero-mean, unit-variance stationary process with autocorrelation function $\rho_k$. Suppose that $\mu_t$ is a nonconstant function and that $\sigma_t$ is a positive-valued nonconstant function. The observed series is formed as $Y_t = \mu_t + \sigma_t X_t$.

(a) Find the mean and covariance function for the $\{Y_t\}$ process. Notice that $\mathrm{Cov}(X_t, X_{t-k}) = \mathrm{Corr}(X_t, X_{t-k}) = \rho_k$ since $\{X_t\}$ has unit variance. $E(Y_t) = E(\mu_t + \sigma_t X_t) = \mu_t + \sigma_t E(X_t) = \mu_t$. Now $\mathrm{Cov}(Y_t, Y_{t-k}) = \mathrm{Cov}(\mu_t + \sigma_t X_t, \mu_{t-k} + \sigma_{t-k} X_{t-k}) = \sigma_t \sigma_{t-k}\,\mathrm{Cov}(X_t, X_{t-k}) = \sigma_t \sigma_{t-k}\,\rho_k$. Notice that $\mathrm{Var}(Y_t) = \sigma_t^2$.

(b) Show that the autocorrelation function for the $\{Y_t\}$ process depends only on the time lag. Is the $\{Y_t\}$ process stationary?
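As a numerical illustration of the covariance formula in part (a) of Exercise 2.10 (the particular $\mu_t$, $\sigma_t$, and signal here are my own choices, not the book's): take $X_t = (w_t + w_{t-1})/\sqrt{2}$ for white noise $w_t$, so that $X_t$ has unit variance and $\rho_1 = 1/2$, and check by simulation that $\mathrm{Cov}(Y_t, Y_{t-1}) \approx \sigma_t\sigma_{t-1}/2$.

```python
import math
import random
import statistics

# Simulation sketch of Exercise 2.10(a); mu_t, sigma_t, and the signal process
# are assumptions of this example only.  With X_t = (w_t + w_{t-1})/sqrt(2)
# (unit variance, rho_1 = 1/2), Y_t = mu_t + sigma_t*X_t should satisfy
# Cov(Y_t, Y_{t-1}) = sigma_t * sigma_{t-1} / 2.
random.seed(10)
mu = lambda t: 0.1 * t
sigma = lambda t: 1 + 0.5 * math.sin(t)
t, reps = 5, 100_000

def draw_pair():
    w = [random.gauss(0, 1) for _ in range(t + 1)]          # w[0..t]
    x = lambda u: (w[u] + w[u - 1]) / math.sqrt(2)
    return (mu(t) + sigma(t) * x(t), mu(t - 1) + sigma(t - 1) * x(t - 1))

pairs = [draw_pair() for _ in range(reps)]
ma = statistics.fmean(p[0] for p in pairs)
mb = statistics.fmean(p[1] for p in pairs)
cov_hat = statistics.fmean((a - ma) * (b - mb) for a, b in pairs)
theory = sigma(t) * sigma(t - 1) * 0.5
print(round(cov_hat, 3), round(theory, 3))
```

The Monte Carlo covariance agrees with $\sigma_t\sigma_{t-1}\rho_1$ to within sampling error, which is exactly the structure used in part (b) below.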
$\mathrm{Corr}(Y_t, Y_{t-k}) = \sigma_t\sigma_{t-k}\rho_k/(\sigma_t\sigma_{t-k}) = \rho_k$, but $\{Y_t\}$ is not necessarily stationary since $E(Y_t) = \mu_t$ varies with $t$.

(c) Is it possible to have a time series with a constant mean and with $\mathrm{Corr}(Y_t, Y_{t-k})$ free of $t$ but with $\{Y_t\}$ not stationary? If $\mu_t$ is constant but $\sigma_t$ varies with $t$, this will be the case.

Exercise 2.11 Suppose $\mathrm{Cov}(X_t, X_{t-k}) = \gamma_k$ is free of $t$ but that $E(X_t) = 3t$.

(a) Is $\{X_t\}$ stationary? No, since $E(X_t)$ varies with $t$.

(b) Let $Y_t = 7 - 3t + X_t$. Is $\{Y_t\}$ stationary? Yes, since the covariances are unchanged but now $E(Y_t) = 7 - 3t + 3t = 7$, free of $t$.

Exercise 2.12 Suppose that $Y_t = e_t - e_{t-12}$. Show that $\{Y_t\}$ is stationary and that, for $k > 0$, its autocorrelation function is nonzero only for lag $k = 12$. $E(Y_t) = E(e_t - e_{t-12}) = 0$. Also, $\mathrm{Cov}(Y_t, Y_{t-k}) = \mathrm{Cov}(e_t - e_{t-12}, e_{t-k} - e_{t-12-k}) = -\mathrm{Cov}(e_{t-12}, e_{t-k}) = -\sigma_e^2$ when $k = 12$. It is nonzero only for $k = 12$ since, otherwise, all of the error terms involved are uncorrelated.

Exercise 2.13 Let $Y_t = e_t - \theta e_{t-1}^2$. For this exercise, assume that the white noise series is normally distributed.

(a) Find the autocorrelation function for $\{Y_t\}$. First recall that for a zero-mean normal distribution, $E(e_t^3) = 0$ and $E(e_t^4) = 3\sigma_e^4$. Then $E(Y_t) = -\theta\,\mathrm{Var}(e_{t-1}) = -\theta\sigma_e^2$, which is constant in $t$, and
$$\mathrm{Var}(Y_t) = \mathrm{Var}(e_t) + \theta^2\,\mathrm{Var}(e_{t-1}^2) = \sigma_e^2 + \theta^2\{E(e_{t-1}^4) - [E(e_{t-1}^2)]^2\} = \sigma_e^2 + \theta^2\{3\sigma_e^4 - \sigma_e^4\} = \sigma_e^2 + 2\theta^2\sigma_e^4.$$
Also, $\mathrm{Cov}(Y_t, Y_{t-1}) = \mathrm{Cov}(e_t - \theta e_{t-1}^2, e_{t-1} - \theta e_{t-2}^2) = \mathrm{Cov}(-\theta e_{t-1}^2, e_{t-1}) = -\theta E(e_{t-1}^3) = 0$, and all other covariances are also zero.

(b) Is $\{Y_t\}$ stationary? Yes; in fact, it is a non-normal white noise in disguise.

Exercise 2.14 Evaluate the mean and covariance function for each of the following processes. In each case determine whether or not the process is stationary.

(a) $Y_t = \theta_0 + t e_t$. The mean is $\theta_0$, but it is not stationary since $\mathrm{Var}(Y_t) = t^2\,\mathrm{Var}(e_t) = t^2\sigma_e^2$ is not free of $t$.

(b) $W_t = \nabla Y_t$, with $Y_t$ as in part (a), so that $W_t = t e_t - (t-1)e_{t-1}$. So the mean of $W_t$ is zero. However, $\mathrm{Var}(W_t) = [t^2 + (t-1)^2]\sigma_e^2$, which depends on $t$, and $\{W_t\}$ is not stationary.

(c) $Y_t = e_t e_{t-1}$. (You may assume that $\{e_t\}$ is normal white noise.) The mean of $Y_t$ is clearly zero. Lag one is the only lag at which there might be correlation. However, $\mathrm{Cov}(Y_t, Y_{t-1}) = E(e_t e_{t-1}^2 e_{t-2}) = E(e_t)E(e_{t-1}^2)E(e_{t-2}) = 0$.
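Supporting the computation for Exercise 2.14(c), here is a quick simulation (seed and series length are my own choices): the sample autocorrelations of $Y_t = e_t e_{t-1}$ should be near zero at every nonzero lag.

```python
import random

# Simulation check of Exercise 2.14(c): Y_t = e_t * e_{t-1} should look like
# (non-normal) white noise, so its sample ACF should be near 0 at lags 1, 2, 3.
random.seed(4)
n = 100_000
e = [random.gauss(0, 1) for _ in range(n + 1)]
y = [e[t] * e[t - 1] for t in range(1, n + 1)]
m = sum(y) / n
d = [v - m for v in y]
den = sum(v * v for v in d)
acf = lambda k: sum(d[i] * d[i - k] for i in range(k, n)) / den
rho = [acf(k) for k in (1, 2, 3)]
print([round(v, 3) for v in rho])
```

All three sample autocorrelations come out within sampling error of zero.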
So the process $Y_t = e_t e_{t-1}$ is stationary and is a non-normal white noise.

Exercise 2.15 Suppose that $X$ is a random variable with zero mean. Define a time series by $Y_t = (-1)^t X$.

(a) Find the mean function for $\{Y_t\}$. $E(Y_t) = (-1)^t E(X) = 0$.

(b) Find the covariance function for $\{Y_t\}$. $\mathrm{Cov}(Y_t, Y_{t-k}) = \mathrm{Cov}[(-1)^t X, (-1)^{t-k} X] = (-1)^{2t-k}\,\mathrm{Cov}(X, X) = (-1)^k\sigma_X^2$.

(c) Is $\{Y_t\}$ stationary? Yes: the mean is constant and the covariance depends only on the lag.

Exercise 2.16 Suppose $Y_t = A + X_t$, where $\{X_t\}$ is stationary and $A$ is random but independent of $\{X_t\}$. Find the mean and covariance function for $\{Y_t\}$ in terms of the mean and autocovariance function for $\{X_t\}$ and the mean and variance of $A$. First, $E(Y_t) = E(A) + E(X_t) = \mu_A + \mu_X$, free of $t$. Also, since $\{X_t\}$ and $A$ are independent, $\mathrm{Cov}(Y_t, Y_{t-k}) = \mathrm{Cov}(A + X_t, A + X_{t-k}) = \mathrm{Cov}(A, A) + \mathrm{Cov}(X_t, X_{t-k}) = \mathrm{Var}(A) + \gamma_k$, free of $t$.

Exercise 2.17 Let $\{Y_t\}$ be stationary with autocovariance function $\gamma_k$. Let $\bar{Y} = \frac{1}{n}\sum_{t=1}^n Y_t$. Show that
$$\mathrm{Var}(\bar{Y}) = \frac{\gamma_0}{n} + \frac{2}{n}\sum_{k=1}^{n-1}\Big(1 - \frac{k}{n}\Big)\gamma_k = \frac{1}{n}\sum_{k=-n+1}^{n-1}\Big(1 - \frac{|k|}{n}\Big)\gamma_k.$$
We have $\mathrm{Var}(\bar{Y}) = \frac{1}{n^2}\sum_{t=1}^n\sum_{s=1}^n \mathrm{Cov}(Y_t, Y_s)$. Now make the change of variables $t - s = k$ and $t = j$ in the double sum. The range of the summation $\{1 \le t \le n,\ 1 \le s \le n\}$ is transformed into $\{1 \le j \le n,\ 1 \le j - k \le n\} = \{k+1 \le j \le n+k,\ 1 \le j \le n\}$, which may be written $\{k > 0,\ k+1 \le j \le n\} \cup \{k \le 0,\ 1 \le j \le n+k\}$. Thus
$$\mathrm{Var}(\bar{Y}) = \frac{1}{n^2}\Big[\sum_{k=1}^{n-1}\ \sum_{j=k+1}^{n}\gamma_k + \sum_{k=-n+1}^{0}\ \sum_{j=1}^{n+k}\gamma_k\Big] = \frac{1}{n^2}\Big[\sum_{k=1}^{n-1}(n-k)\gamma_k + \sum_{k=-n+1}^{0}(n+k)\gamma_k\Big].$$
Use $\gamma_k = \gamma_{-k}$ to get the first expression in the exercise.

Exercise 2.18 Let $\{Y_t\}$ be stationary with autocovariance function $\gamma_k$. Define the sample variance as $S^2 = \frac{1}{n-1}\sum_{t=1}^n (Y_t - \bar{Y})^2$.

(a) First show that $\sum_{t=1}^n (Y_t - \mu)^2 = \sum_{t=1}^n (Y_t - \bar{Y})^2 + n(\bar{Y} - \mu)^2$.
$$\sum_{t=1}^n (Y_t - \mu)^2 = \sum_{t=1}^n [(Y_t - \bar{Y}) + (\bar{Y} - \mu)]^2 = \sum_{t=1}^n (Y_t - \bar{Y})^2 + n(\bar{Y} - \mu)^2 + 2(\bar{Y} - \mu)\sum_{t=1}^n (Y_t - \bar{Y}) = \sum_{t=1}^n (Y_t - \bar{Y})^2 + n(\bar{Y} - \mu)^2,$$
since $\sum_{t=1}^n (Y_t - \bar{Y}) = 0$.

(b) Use part (a) to show that
$$E(S^2) = \frac{n}{n-1}\gamma_0 - \frac{n}{n-1}\mathrm{Var}(\bar{Y}) = \gamma_0 - \frac{2}{n-1}\sum_{k=1}^{n-1}\Big(1 - \frac{k}{n}\Big)\gamma_k.$$
(Use the results of Exercise 2.17 for the last expression.)
$$E(S^2) = \frac{1}{n-1}E\Big[\sum_{t=1}^n (Y_t - \mu)^2 - n(\bar{Y} - \mu)^2\Big] = \frac{1}{n-1}\Big[\sum_{t=1}^n E(Y_t - \mu)^2 - nE(\bar{Y} - \mu)^2\Big] = \frac{1}{n-1}[n\gamma_0 - n\,\mathrm{Var}(\bar{Y})],$$
and the last expression follows from Exercise 2.17.

(c) If $\{Y_t\}$ is a white noise process with variance $\sigma_e^2$, show that $E(S^2) = \sigma_e^2$. This follows since for white noise $\gamma_k = 0$ for $k > 0$.

Exercise 2.19 Let $Y_1 = \theta_0 + e_1$, and then for $t > 1$ define $Y_t$ recursively by $Y_t = \theta_0 + Y_{t-1} + e_t$. Here $\theta_0$ is a constant. The process $\{Y_t\}$ is called a random walk with drift.

(a) Show that $Y_t$ may be rewritten as $Y_t = t\theta_0 + e_t + e_{t-1} + \cdots + e_1$.
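Before the algebra, a deterministic spot check (with arbitrary made-up numbers; $\theta_0$, $n$, and the seed are my own) that the recursion really unrolls to the claimed representation:

```python
import random

# Spot check of Exercise 2.19(a): the recursion Y_1 = theta0 + e_1,
# Y_t = theta0 + Y_{t-1} + e_t should unroll to
# Y_t = t*theta0 + e_t + e_{t-1} + ... + e_1 for every t.
random.seed(2)
theta0, n = 1.5, 30
e = [random.gauss(0, 1) for _ in range(n + 1)]       # e[1..n] are used
y = [0.0] * (n + 1)
y[1] = theta0 + e[1]
for t in range(2, n + 1):
    y[t] = theta0 + y[t - 1] + e[t]
ok = all(abs(y[t] - (t * theta0 + sum(e[1:t + 1]))) < 1e-9 for t in range(1, n + 1))
print(ok)
```

The check passes for every $t$ up to $n$, matching the substitution argument that follows.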
Substitute $Y_{t-1} = \theta_0 + Y_{t-2} + e_{t-1}$ into $Y_t = \theta_0 + Y_{t-1} + e_t$ and repeat until you get back to $e_1$.

(b) Find the mean function for $Y_t$. $E(Y_t) = E(t\theta_0 + e_t + e_{t-1} + \cdots + e_1) = t\theta_0$.

(c) Find the autocovariance function for $Y_t$.
$$\mathrm{Cov}(Y_t, Y_{t-k}) = \mathrm{Cov}[t\theta_0 + e_t + e_{t-1} + \cdots + e_1,\ (t-k)\theta_0 + e_{t-k} + e_{t-k-1} + \cdots + e_1] = \mathrm{Var}(e_{t-k} + \cdots + e_1) = (t-k)\sigma_e^2 \quad \text{for } t \ge k.$$

Exercise 2.20 Consider the standard random walk model where $Y_t = Y_{t-1} + e_t$ with $Y_1 = e_1$.

(a) Use the above representation of $Y_t$ to show that $\mu_t = \mu_{t-1}$ for $t > 1$ with initial condition $\mu_1 = E(e_1) = 0$. Hence show that $\mu_t = 0$ for all $t$. Clearly, $\mu_1 = E(Y_1) = E(e_1) = 0$. Then $E(Y_t) = E(Y_{t-1} + e_t) = E(Y_{t-1}) + E(e_t) = E(Y_{t-1})$, or $\mu_t = \mu_{t-1}$ for $t > 1$, and the result follows by induction.

(b) Similarly, show that $\mathrm{Var}(Y_t) = \mathrm{Var}(Y_{t-1}) + \sigma_e^2$ for $t > 1$ with $\mathrm{Var}(Y_1) = \sigma_e^2$, and hence $\mathrm{Var}(Y_t) = t\sigma_e^2$. $\mathrm{Var}(Y_1) = \sigma_e^2$ is immediate. Then $\mathrm{Var}(Y_t) = \mathrm{Var}(Y_{t-1} + e_t) = \mathrm{Var}(Y_{t-1}) + \mathrm{Var}(e_t) = \mathrm{Var}(Y_{t-1}) + \sigma_e^2$. Recursion or induction on $t$ yields $\mathrm{Var}(Y_t) = t\sigma_e^2$.

(c) For $0 < t \le s$, use $Y_s = Y_t + e_{t+1} + e_{t+2} + \cdots + e_s$ to show that $\mathrm{Cov}(Y_t, Y_s) = \mathrm{Var}(Y_t)$ and, hence, that $\mathrm{Cov}(Y_t, Y_s) = \min(t, s)\sigma_e^2$. For $0 < t \le s$, $\mathrm{Cov}(Y_t, Y_s) = \mathrm{Cov}(Y_t, Y_t + e_{t+1} + e_{t+2} + \cdots + e_s) = \mathrm{Cov}(Y_t, Y_t) = \mathrm{Var}(Y_t) = t\sigma_e^2$, and hence the result.

Exercise 2.21 A random walk with random starting value. Let $Y_t = Y_0 + e_t + e_{t-1} + \cdots + e_1$ for $t > 0$, where $Y_0$ has a distribution with mean $\mu_0$ and variance $\sigma_0^2$. Suppose further that $Y_0, e_1, \dots, e_t$ are independent.

(a) Show that $E(Y_t) = \mu_0$ for all $t$. $E(Y_t) = E(Y_0 + e_t + e_{t-1} + \cdots + e_1) = E(Y_0) + E(e_t) + E(e_{t-1}) + \cdots + E(e_1) = E(Y_0) = \mu_0$.

(b) Show that $\mathrm{Var}(Y_t) = t\sigma_e^2 + \sigma_0^2$. $\mathrm{Var}(Y_t) = \mathrm{Var}(Y_0 + e_t + e_{t-1} + \cdots + e_1) = \mathrm{Var}(Y_0) + \mathrm{Var}(e_t) + \mathrm{Var}(e_{t-1}) + \cdots + \mathrm{Var}(e_1) = \sigma_0^2 + t\sigma_e^2$.

(c) Show that $\mathrm{Cov}(Y_t, Y_s) = \min(t, s)\sigma_e^2 + \sigma_0^2$. Let $t$ be less than $s$. Then, as in the previous exercise, $\mathrm{Cov}(Y_t, Y_s) = \mathrm{Cov}(Y_t, Y_t + e_{t+1} + e_{t+2} + \cdots + e_s) = \mathrm{Var}(Y_t) = t\sigma_e^2 + \sigma_0^2$.

(d) Show that $\mathrm{Corr}(Y_t, Y_s) = \sqrt{\dfrac{t\sigma_e^2 + \sigma_0^2}{s\sigma_e^2 + \sigma_0^2}}$ for $0 < t \le s$. Just use the results of parts (b) and (c).

Exercise 2.22 Let $\{e_t\}$ be a zero-mean white noise process and let $c$ be a constant with $|c| < 1$.
Define $Y_t$ recursively by $Y_t = cY_{t-1} + e_t$ with $Y_1 = e_1$.

This exercise can be solved using the recursive definition of $Y_t$ or by expressing $Y_t$ explicitly using repeated substitution as $Y_t = c(cY_{t-2} + e_{t-1}) + e_t = \cdots = e_t + ce_{t-1} + c^2 e_{t-2} + \cdots + c^{t-1}e_1$. Parts (c), (d), and (e) essentially assume you are working with the recursive version of $Y_t$, but they can also be solved using this explicit representation.

(a) Show that $E(Y_t) = 0$. First, $E(Y_1) = E(e_1) = 0$. Then $E(Y_t) = cE(Y_{t-1}) + E(e_t) = cE(Y_{t-1})$, and the result follows by induction on $t$.

(b) Show that $\mathrm{Var}(Y_t) = \sigma_e^2(1 + c^2 + c^4 + \cdots + c^{2(t-1)})$. Is $\{Y_t\}$ stationary?
$$\mathrm{Var}(Y_t) = \mathrm{Var}(e_t + ce_{t-1} + \cdots + c^{t-1}e_1) = \sigma_e^2(1 + c^2 + c^4 + \cdots + c^{2(t-1)}).$$
$\{Y_t\}$ is not stationary since $\mathrm{Var}(Y_t)$ depends on $t$. Alternatively,
$$\mathrm{Var}(Y_t) = \mathrm{Var}(cY_{t-1} + e_t) = c^2\mathrm{Var}(Y_{t-1}) + \sigma_e^2 = c^4\mathrm{Var}(Y_{t-2}) + \sigma_e^2(1 + c^2) = \cdots = \sigma_e^2(1 + c^2 + c^4 + \cdots + c^{2(t-1)}).$$

(c) Show that $\mathrm{Corr}(Y_t, Y_{t-1}) = c\sqrt{\dfrac{\mathrm{Var}(Y_{t-1})}{\mathrm{Var}(Y_t)}}$ and, in general,
$$\mathrm{Corr}(Y_t, Y_{t-k}) = c^k\sqrt{\frac{\mathrm{Var}(Y_{t-k})}{\mathrm{Var}(Y_t)}} \quad \text{for } k > 0.$$
(Hint: Argue that $Y_{t-1}$ is independent of $e_t$. Then use $\mathrm{Cov}(Y_t, Y_{t-1}) = \mathrm{Cov}(cY_{t-1} + e_t, Y_{t-1})$.)
$\mathrm{Cov}(Y_t, Y_{t-1}) = \mathrm{Cov}(cY_{t-1} + e_t, Y_{t-1}) = c\,\mathrm{Var}(Y_{t-1})$, so
$$\mathrm{Corr}(Y_t, Y_{t-1}) = \frac{c\,\mathrm{Var}(Y_{t-1})}{\sqrt{\mathrm{Var}(Y_t)\mathrm{Var}(Y_{t-1})}} = c\sqrt{\frac{\mathrm{Var}(Y_{t-1})}{\mathrm{Var}(Y_t)}}.$$
Similarly, $\mathrm{Cov}(Y_t, Y_{t-k}) = \mathrm{Cov}(cY_{t-1} + e_t, Y_{t-k}) = c\,\mathrm{Cov}(Y_{t-1}, Y_{t-k}) = c^2\,\mathrm{Cov}(Y_{t-2}, Y_{t-k}) = \cdots = c^k\,\mathrm{Var}(Y_{t-k})$, so
$$\mathrm{Corr}(Y_t, Y_{t-k}) = \frac{c^k\,\mathrm{Var}(Y_{t-k})}{\sqrt{\mathrm{Var}(Y_t)\mathrm{Var}(Y_{t-k})}} = c^k\sqrt{\frac{\mathrm{Var}(Y_{t-k})}{\mathrm{Var}(Y_t)}},$$
as required.

(d) For large $t$, argue that $\mathrm{Var}(Y_t) \approx \dfrac{\sigma_e^2}{1 - c^2}$ and $\mathrm{Corr}(Y_t, Y_{t-k}) \approx c^k$ for $k > 0$, so that $\{Y_t\}$ could be called asymptotically stationary. These two results follow from parts (b) and (c).

(e) Suppose now that we alter the initial condition and put $Y_1 = e_1/\sqrt{1 - c^2}$. Show that now $\{Y_t\}$ is stationary.
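The formulas in parts (b) and (c) can be checked deterministically via the explicit representation, since covariances are just dot products of the coefficient vectors on $e_1, \dots, e_t$ (the particular values of $c$, $t$, and $k$ below are arbitrary choices for illustration):

```python
# Deterministic check of Exercise 2.22(b),(c) using the explicit representation
# Y_t = e_t + c*e_{t-1} + ... + c^{t-1}*e_1 with sigma_e = 1: the coefficient of
# e_j in Y_t is c^(t-j), so variances/covariances are dot products of these vectors.
c, t, k = 0.6, 12, 3

def coeffs(t):
    return [c ** (t - j) for j in range(1, t + 1)]   # coefficient of e_j, j = 1..t

def dot(a, b):
    m = min(len(a), len(b))                          # e_j common to both series
    return sum(a[i] * b[i] for i in range(m))

var_t = dot(coeffs(t), coeffs(t))
var_tk = dot(coeffs(t - k), coeffs(t - k))
cov = dot(coeffs(t), coeffs(t - k))
closed_var = sum(c ** (2 * i) for i in range(t))     # 1 + c^2 + ... + c^(2(t-1))
corr = cov / (var_t * var_tk) ** 0.5
closed_corr = c ** k * (var_tk / var_t) ** 0.5       # c^k * sqrt(Var(Y_{t-k})/Var(Y_t))
assert abs(var_t - closed_var) < 1e-9
assert abs(corr - closed_corr) < 1e-9
print(round(var_t, 4), round(corr, 4))
```

Both closed forms match the direct computation exactly, for any $|c| < 1$ you substitute.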
This part can be solved using repeated substitution to express $Y_t$ explicitly as
$$Y_t = e_t + ce_{t-1} + \cdots + c^{t-2}e_2 + \frac{c^{t-1}}{\sqrt{1 - c^2}}e_1.$$
Then show that $\mathrm{Var}(Y_t) = \dfrac{\sigma_e^2}{1 - c^2}$ and $\mathrm{Corr}(Y_t, Y_{t-k}) = c^k$ for $k > 0$.

Exercise 2.23 Two processes $\{Z_t\}$ and $\{Y_t\}$ are said to be independent if for any time points $t_1, t_2, \dots, t_m$ and $s_1, s_2, \dots, s_n$, the random variables $(Z_{t_1}, Z_{t_2}, \dots, Z_{t_m})$ are independent of the random variables $(Y_{s_1}, Y_{s_2}, \dots, Y_{s_n})$. Show that if $\{Z_t\}$ and $\{Y_t\}$ are independent stationary processes, then $W_t = Z_t + Y_t$ is stationary. First, $E(W_t) = E(Z_t) + E(Y_t) = \mu_Z + \mu_Y$. Then $\mathrm{Cov}(W_t, W_{t-k}) = \mathrm{Cov}(Z_t + Y_t, Z_{t-k} + Y_{t-k}) = \mathrm{Cov}(Z_t, Z_{t-k}) + \mathrm{Cov}(Y_t, Y_{t-k})$, which is free of $t$ since both $\{Z_t\}$ and $\{Y_t\}$ are stationary.

Exercise 2.24 Let $\{X_t\}$ be a time series in which we are interested. However, because the measurement process itself is not perfect, we actually observe $Y_t = X_t + e_t$. We assume that $\{X_t\}$ and $\{e_t\}$ are independent processes. We call $X_t$ the signal and $e_t$ the measurement noise or error process. If $\{X_t\}$ is stationary with autocorrelation function $\rho_k$, show that $\{Y_t\}$ is also stationary with
$$\mathrm{Corr}(Y_t, Y_{t-k}) = \frac{\rho_k}{1 + \sigma_e^2/\sigma_X^2} \quad \text{for } k \ge 1.$$
We call $\sigma_X^2/\sigma_e^2$ the signal-to-noise ratio, or SNR. Note that the larger the SNR, the closer the autocorrelation function of the observed process $\{Y_t\}$ is to the autocorrelation function of the desired signal $\{X_t\}$. First, $E(Y_t) = E(X_t) + E(e_t) = \mu_X$, free of $t$. Next, for $k \ge 1$, $\mathrm{Cov}(Y_t, Y_{t-k}) = \mathrm{Cov}(X_t + e_t, X_{t-k} + e_{t-k}) = \mathrm{Cov}(X_t, X_{t-k}) + \mathrm{Cov}(e_t, e_{t-k}) = \mathrm{Cov}(X_t, X_{t-k}) = \mathrm{Var}(X_t)\rho_k$, which is free of $t$. Finally,
$$\mathrm{Corr}(Y_t, Y_{t-k}) = \frac{\mathrm{Var}(X_t)\rho_k}{\mathrm{Var}(Y_t)} = \frac{\sigma_X^2\rho_k}{\sigma_X^2 + \sigma_e^2} = \frac{\rho_k}{1 + \sigma_e^2/\sigma_X^2} \quad \text{for } k \ge 1.$$

Exercise 2.25 Suppose $Y_t = \beta_0 + \sum_{i=1}^k [A_i\cos(2\pi f_i t) + B_i\sin(2\pi f_i t)]$, where $\beta_0, f_1, f_2, \dots, f_k$ are constants and $A_1, A_2, \dots, A_k, B_1, B_2, \dots, B_k$ are independent random variables with zero means and variances $\mathrm{Var}(A_i) = \mathrm{Var}(B_i) = \sigma_i^2$. Show that $Y_t$ is stationary and find its covariance function.
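A hedged simulation of the signal-plus-noise result in Exercise 2.24 (the MA(1) signal and all parameters are my own choices): take $X_t = (w_t + w_{t-1})/\sqrt{2}$, so $\mathrm{Var}(X_t) = 1$ and $\rho_1 = 1/2$, and add independent measurement noise of variance 1, giving SNR $= 1$. The observed series should then have lag-1 autocorrelation $0.5/(1 + 1) = 0.25$.

```python
import random

# Simulation of Exercise 2.24 with SNR = 1: the lag-1 autocorrelation of the
# signal (rho_1 = 0.5) should be attenuated to 0.5/(1 + 1) = 0.25 in Y_t = X_t + e_t.
random.seed(11)
n = 100_000
w = [random.gauss(0, 1) for _ in range(n + 1)]
x = [(w[t] + w[t - 1]) / 2 ** 0.5 for t in range(1, n + 1)]   # signal, Var = 1
y = [v + random.gauss(0, 1) for v in x]                       # add noise, Var = 1
m = sum(y) / n
num = sum((y[t] - m) * (y[t - 1] - m) for t in range(1, n))
den = sum((v - m) ** 2 for v in y)
rho1_hat = num / den
print(round(rho1_hat, 3))
```

Doubling the noise variance in this sketch would attenuate the autocorrelation further, to $0.5/(1+2) \approx 0.17$, illustrating the SNR interpretation.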
Compare this exercise with the results for the random cosine wave on page 18. First,
$$E(Y_t) = \beta_0 + \sum_{i=1}^k [E(A_i)\cos(2\pi f_i t) + E(B_i)\sin(2\pi f_i t)] = \beta_0.$$
Next, using the independence of $A_1, A_2, \dots, A_k, B_1, B_2, \dots, B_k$ and some trig identities, we have
$$\mathrm{Cov}(Y_t, Y_s) = \sum_{i=1}^k \{\cos(2\pi f_i t)\cos(2\pi f_i s)\,\mathrm{Var}(A_i) + \sin(2\pi f_i t)\sin(2\pi f_i s)\,\mathrm{Var}(B_i)\}$$
$$= \sum_{i=1}^k \sigma_i^2\Big\{\tfrac{1}{2}[\cos(2\pi f_i(t-s)) + \cos(2\pi f_i(t+s))] + \tfrac{1}{2}[\cos(2\pi f_i(t-s)) - \cos(2\pi f_i(t+s))]\Big\} = \sum_{i=1}^k \sigma_i^2\cos(2\pi f_i(t-s)),$$
and hence the process is stationary.

Exercise 2.26 Define the function $\Gamma_{t,s} = \frac{1}{2}E[(Y_t - Y_s)^2]$. In geostatistics, $\Gamma_{t,s}$ is called the semivariogram.

(a) Show that for a stationary process, $\Gamma_{t,s} = \gamma_0 - \gamma_{|t-s|}$. Without loss of generality, we may assume that the stationary process has a zero mean. Then
$$\tfrac{1}{2}E[(Y_t - Y_s)^2] = \tfrac{1}{2}[E(Y_t^2) - 2E(Y_t Y_s) + E(Y_s^2)] = \tfrac{1}{2}[\gamma_0 - 2\gamma_{|t-s|} + \gamma_0] = \gamma_0 - \gamma_{|t-s|}.$$

(b) A process is said to be intrinsically stationary if $\Gamma_{t,s}$ depends only on the time difference $t - s$. Show that the random walk process is intrinsically stationary. For the random walk, for $t > s$ we have $Y_t = e_t + e_{t-1} + \cdots + e_1$, so that $Y_t - Y_s = e_t + e_{t-1} + \cdots + e_{s+1}$ and $\Gamma_{t,s} = \frac{1}{2}E[(Y_t - Y_s)^2] = \frac{1}{2}\mathrm{Var}(e_t + e_{t-1} + \cdots + e_{s+1}) = \frac{1}{2}(t-s)\sigma_e^2$, as required.

Exercise 2.27 For a fixed, positive integer $r$ and constant $\phi$, consider the time series defined by $Y_t = e_t + \phi e_{t-1} + \phi^2 e_{t-2} + \cdots + \phi^r e_{t-r}$.

(a) Show that this process is stationary for any value of $\phi$. The mean is clearly zero, and for $0 \le k \le r$,
$$\mathrm{Cov}(Y_t, Y_{t-k}) = \mathrm{Cov}(e_t + \phi e_{t-1} + \cdots + \phi^r e_{t-r},\ e_{t-k} + \phi e_{t-k-1} + \cdots + \phi^r e_{t-k-r}) = (\phi^k + \phi^{k+2} + \phi^{k+4} + \cdots + \phi^{k+2(r-k)})\sigma_e^2 = \phi^k(1 + \phi^2 + \phi^4 + \cdots + \phi^{2(r-k)})\sigma_e^2,$$
which is free of $t$; the covariance is zero for $k > r$.

(b) Find the autocorrelation function. We have $\mathrm{Var}(Y_t) = \mathrm{Var}(e_t + \phi e_{t-1} + \cdots + \phi^r e_{t-r}) = (1 + \phi^2 + \phi^4 + \cdots + \phi^{2r})\sigma_e^2$, so
$$\mathrm{Corr}(Y_t, Y_{t-k}) = \frac{\phi^k(1 + \phi^2 + \phi^4 + \cdots + \phi^{2(r-k)})}{1 + \phi^2 + \phi^4 + \cdots + \phi^{2r}} \quad \text{for } 0 \le k \le r.$$
The results in parts (a) and (b) can be simplified, using geometric sums, for $\phi^2 \ne 1$ and separately for $\phi^2 = 1$.

Exercise 2.28 (Random cosine wave extended) Suppose that $Y_t = R\cos(2\pi(ft + \Phi))$ for $t = 0, \pm 1, \pm 2, \dots$, where $0 < f < \frac{1}{2}$ is a fixed frequency and $R$ and $\Phi$ are uncorrelated random variables, with $\Phi$ uniformly distributed on the interval $(0, 1)$.

(a) Show that $E(Y_t) = 0$ for all $t$.
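The overlap sum in Exercise 2.27 can be verified deterministically ($\phi$ and $r$ below are arbitrary illustrative choices; note the result holds even for $|\phi| > 1$):

```python
# Deterministic check of Exercise 2.27: with sigma_e = 1, the lag-k autocovariance
# of Y_t = e_t + phi*e_{t-1} + ... + phi^r*e_{t-r} is the overlap sum
# sum_j phi^(k+j) * phi^j for j = 0..r-k, which should equal
# phi^k * (1 + phi^2 + ... + phi^(2(r-k))), and 0 for k > r.
phi, r = 1.3, 6                       # works for any phi, including |phi| > 1
coef = [phi ** i for i in range(r + 1)]

def gamma(k):
    if k > r:
        return 0.0
    # e_{t-k-j} carries coefficient phi^(k+j) in Y_t and phi^j in Y_{t-k}
    return sum(coef[k + j] * coef[j] for j in range(r - k + 1))

for k in range(r + 1):
    closed = phi ** k * sum(phi ** (2 * j) for j in range(r - k + 1))
    assert abs(gamma(k) - closed) <= 1e-9 * closed
print([round(gamma(k), 2) for k in range(r + 2)])
```

The printed autocovariances cut off exactly at lag $r$, as the solution states.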
$E(Y_t) = E[R\cos(2\pi(ft + \Phi))] = E(R)E[\cos(2\pi(ft + \Phi))]$. But $E[\cos(2\pi(ft + \Phi))] = 0$, by a calculation entirely similar to the one on page 18.

(b) Show that the process is stationary with $\gamma_k = \frac{1}{2}E(R^2)\cos(2\pi fk)$.
$$\gamma_k = E[R\cos(2\pi(ft + \Phi))\,R\cos(2\pi(f(t-k) + \Phi))] = E(R^2)E[\cos(2\pi(ft + \Phi))\cos(2\pi(f(t-k) + \Phi))],$$
and then use the calculations leading up to Equation (2.3.4), page 19, to show that
$$E[\cos(2\pi(ft + \Phi))\cos(2\pi(f(t-k) + \Phi))] = \tfrac{1}{2}\cos(2\pi fk).$$

Exercise 2.29 (Random cosine wave extended more) Suppose that $Y_t = \sum_{j=1}^m R_j\cos[2\pi(f_j t + \Phi_j)]$ for $t = 0, \pm 1, \pm 2, \dots$, where $0 \le f_1 < f_2 < \cdots < f_m < \frac{1}{2}$ are $m$ fixed frequencies, and $R_1, \Phi_1, R_2, \Phi_2, \dots, R_m, \Phi_m$ are uncorrelated random variables with each $\Phi_j$ uniformly distributed on the interval $(0, 1)$.

(a) Show that $E(Y_t) = 0$ for all $t$.

(b) Show that the process is stationary with $\gamma_k = \frac{1}{2}\sum_{j=1}^m E(R_j^2)\cos(2\pi f_j k)$.

Parts (a) and (b) follow directly from the solution of Exercise 2.28, using the independence.

Exercise 2.30 (Mathematical statistics required) Suppose that $Y_t = R\cos[2\pi(ft + \Phi)]$ for $t = 0, \pm 1, \pm 2, \dots$, where $R$ and $\Phi$ are independent random variables and $f$ is a fixed frequency. The phase $\Phi$ is assumed to be uniformly distributed on $(0, 1)$, and the amplitude $R$ has a Rayleigh distribution with pdf $f(r) = re^{-r^2/2}$ for $r > 0$. Show that for each time point $t$, $Y_t$ has a normal distribution. (Hint: Let $Y = R\cos[2\pi(ft + \Phi)]$ and $X = R\sin[2\pi(ft + \Phi)]$. Now find the joint distribution of $X$ and $Y$. It can also be shown that all of the finite-dimensional distributions are multivariate normal, and hence the process is strictly stationary.)

For fixed $t$ and $f$, consider the one-to-one transformation defined by $Y = R\cos[2\pi(ft + \Phi)]$ and $X = R\sin[2\pi(ft + \Phi)]$. The range for $(X, Y)$ is $-\infty < X < \infty$, $-\infty < Y < \infty$, with $X^2 + Y^2 = R^2 < \infty$. Furthermore,
$$\frac{\partial Y}{\partial R} = \cos[2\pi(ft + \Phi)], \quad \frac{\partial Y}{\partial \Phi} = -2\pi R\sin[2\pi(ft + \Phi)], \quad \frac{\partial X}{\partial R} = \sin[2\pi(ft + \Phi)], \quad \frac{\partial X}{\partial \Phi} = 2\pi R\cos[2\pi(ft + \Phi)],$$
so the Jacobian is $2\pi R = 2\pi\sqrt{X^2 + Y^2}$ in absolute value, with inverse Jacobian $1/(2\pi\sqrt{X^2 + Y^2})$. The joint density for $R$ and $\Phi$ is $f(r, \phi) = re^{-r^2/2}$ for $0 < r$ and $0 < \phi < 1$.
Hence the joint density for $X$ and $Y$ is given by
$$f(x, y) = \sqrt{x^2 + y^2}\,e^{-(x^2 + y^2)/2}\cdot\frac{1}{2\pi\sqrt{x^2 + y^2}} = \Big[\frac{1}{\sqrt{2\pi}}e^{-x^2/2}\Big]\Big[\frac{1}{\sqrt{2\pi}}e^{-y^2/2}\Big] \quad \text{for } -\infty < x < \infty,\ -\infty < y < \infty,$$
as required.

## CHAPTER 3

Exercise 3.1 Verify Equation (3.3.2), page 30, for the least squares estimates of $\beta_0$ and of $\beta_1$ when the model $Y_t = \beta_0 + \beta_1 t + X_t$ is considered. This is a standard calculation in many first courses in statistics, usually using calculus. Here we give an algebraic solution. Without loss of generality, we assume that $Y_t$ and $t$ have each been standardized so that $\sum Y_t = \sum t = 0$ and $\sum Y_t^2 = \sum t^2 = n - 1$. Then we have
$$Q(\beta_0, \beta_1) = \sum [Y_t - (\beta_0 + \beta_1 t)]^2 = n\beta_0^2 + \sum Y_t^2 + \beta_1^2\sum t^2 - 2\beta_1\sum tY_t = n\beta_0^2 + (n-1)(1 + \beta_1^2) - 2\beta_1(n-1)r,$$
where $r = \sum tY_t/(n-1)$ is the correlation coefficient between $Y$ and $t$. Completing the square gives
$$Q(\beta_0, \beta_1) = n\beta_0^2 + (n-1)[(\beta_1 - r)^2 + 1 - r^2].$$
This is clearly smallest when $\beta_0 = 0$ and $\beta_1 = r$. When these results are translated back to (unstandardized) original terms, we obtain the usual ordinary least squares regression results. In addition, by looking at the minimum value of $Q$ we have $Q(\hat{\beta}_0, \hat{\beta}_1) = (n-1)(1 - r^2)$. Since $Q \ge 0$, this also provides a proof that correlations are always between $-1$ and $+1$.

Exercise 3.2 Suppose $Y_t = \mu + e_t - e_{t-1}$. Find $\mathrm{Var}(\bar{Y})$. Note any unusual results. In particular, compare your answer to what would have been obtained if $Y_t = \mu + e_t$. (Hint: You may avoid Equation (3.2.3), page 28, by first doing some algebraic simplification on $\sum_{t=1}^n (e_t - e_{t-1})$.)
$$\bar{Y} = \mu + \frac{1}{n}\sum_{t=1}^n (e_t - e_{t-1}) = \mu + \frac{e_n - e_0}{n}, \quad \text{so} \quad \mathrm{Var}(\bar{Y}) = \frac{2\sigma_e^2}{n^2}.$$
The denominator of $n^2$ is very unusual; we expect a denominator of $n$ in the variance of a sample mean. The negative autocorrelation at lag one makes it easier to estimate the process mean when compared with estimating the mean of a white noise process.

Exercise 3.3 Suppose $Y_t = \mu + e_t + e_{t-1}$. Find $\mathrm{Var}(\bar{Y})$. Compare your answer to what would have been obtained if $Y_t = \mu + e_t$. Describe the effect that the autocorrelation in $\{Y_t\}$ has on $\mathrm{Var}(\bar{Y})$.
$$\bar{Y} = \mu + \frac{1}{n}\sum_{t=1}^n (e_t + e_{t-1}) = \mu + \frac{1}{n}[e_n + 2(e_{n-1} + \cdots + e_1) + e_0], \quad \text{so} \quad \mathrm{Var}(\bar{Y}) = \frac{[2 + 4(n-1)]\sigma_e^2}{n^2} = \frac{2(2n-1)\sigma_e^2}{n^2}.$$
If $Y_t = \mu + e_t$, we would have $\mathrm{Var}(\bar{Y}) = (1/n)\sigma_e^2$, but in our present case $\mathrm{Var}(\bar{Y}) \approx (4/n)\sigma_e^2$, approximately four times larger.
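Both variance results (Exercises 3.2 and 3.3) are easy to confirm by Monte Carlo; the sample size and replication count below are my own choices. With $\sigma_e = 1$ and $n = 20$, the targets are $2/n^2 = 0.005$ for the differenced-noise model and $2(2n-1)/n^2 = 0.195$ for the summed-noise model.

```python
import random
import statistics

# Monte Carlo check of Exercises 3.2 and 3.3: Var(Ybar) for Y_t = mu + e_t - e_{t-1}
# should be 2*sigma_e^2/n^2, and for Y_t = mu + e_t + e_{t-1} it should be
# 2*(2n-1)*sigma_e^2/n^2 (sigma_e = 1 here; mu drops out of the variance).
random.seed(9)
n, reps = 20, 100_000

def ybar(sign):
    e = [random.gauss(0, 1) for _ in range(n + 1)]   # e_0, ..., e_n
    return sum(e[t] + sign * e[t - 1] for t in range(1, n + 1)) / n

var_diff = statistics.pvariance([ybar(-1) for _ in range(reps)])
var_sum  = statistics.pvariance([ybar(+1) for _ in range(reps)])
print(round(var_diff, 4), round(var_sum, 3))   # near 0.005 and 0.195
```

The two estimates bracket the white-noise value $1/n = 0.05$ from below and above, matching the interpretation given in the text.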
The positive autocorrelation at lag one makes it more difficult to estimate the process mean compared with estimating the mean of a white noise process.

Exercise 3.4 The data file `hours` contains monthly values of the average hours worked per week in the U.S. manufacturing sector for July 1982 through June 1987.

(a) Display the time series plot for these data and interpret.

[Figure: time series plot of the monthly hours series, 1982 through 1987.]

> data(hours); plot(hours, ylab='Monthly Hours', type='o')

The plot displays an upward "trend" in the first half of the series. However, there is certainly no distinct pattern in the display.

(b) Now construct a time series plot that uses separate plotting symbols for the various months. Does your interpretation change from that in part (a)?

[Figure: the same series plotted with monthly plotting symbols.]

> plot(hours, ylab='Monthly Hours', type='l')
> points(y=hours, x=time(hours), pch=as.vector(season(hours)))

The most distinct pattern in this plot is that Decembers are nearly always high months relative to the others; Decembers stick out.
