Book Notes: Introduction to Time Series and Forecasting -- Ch3 ARMA Models


ARMA(p,q) Processes

ARMA(p,q) process: definitions

  • $\{X_t\}$ is an ARMA(p,q) process if it is stationary and, for all $t$, $$X_t - \phi_1 X_{t-1} - \cdots - \phi_p X_{t-p} = Z_t + \theta_1 Z_{t-1} + \cdots + \theta_q Z_{t-q},$$ where $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$ and the polynomials $\phi(z) = 1 - \phi_1 z - \cdots - \phi_p z^p$ and $\theta(z) = 1 + \theta_1 z + \cdots + \theta_q z^q$ have no common factors

  • Equivalent formula using the backward shift operator: $\phi(B) X_t = \theta(B) Z_t$

  • An ARMA(p,q) process with mean $\mu$: we can study $\{X_t - \mu\}$, which satisfies $$(X_t - \mu) - \phi_1 (X_{t-1} - \mu) - \cdots - \phi_p (X_{t-p} - \mu) = Z_t + \theta_1 Z_{t-1} + \cdots + \theta_q Z_{t-q}$$

Stationary solution

Stationary solution: existence and uniqueness

  • A stationary solution exists and is unique if and only if $$\phi(z) \neq 0 \quad \text{for all complex } z \text{ with } |z| = 1$$

  • The unit circle: the region in $z \in \mathbb{C}$ defined by $|z| = 1$

  • Stationary solution: $$X_t = \frac{\theta(B)}{\phi(B)} Z_t = \psi(B) Z_t = \sum_{j=-\infty}^{\infty} \psi_j Z_{t-j}$$

Causality

Causality: ϕ(z) has no zeros in the closed unit disk |z| ≤ 1

  • An ARMA(p,q) process is causal if there exist $\psi_0, \psi_1, \ldots$ such that $$\sum_{j=0}^{\infty} |\psi_j| < \infty \quad \text{and} \quad X_t = \sum_{j=0}^{\infty} \psi_j Z_{t-j} \quad \text{for all } t$$

  • Theorem (equivalent condition of causality): $$\phi(z) = 1 - \phi_1 z - \cdots - \phi_p z^p \neq 0 \quad \text{for all } |z| \le 1$$

  • Example: ARMA(1,1), $X_t - \phi X_{t-1} = Z_t + \theta Z_{t-1}$. The AR polynomial $1 - \phi z$ has only one zero, $z = 1/\phi$, so $|z| = 1/|\phi| > 1$, i.e., $|\phi| < 1$ is equivalent to causality
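The causality condition can be checked numerically by locating the zeros of $\phi(z)$. A minimal sketch (the function name and coefficient values are my own, not from the text):

```python
import numpy as np

def is_causal(phi):
    """Check causality: all zeros of phi(z) = 1 - phi_1 z - ... - phi_p z^p
    must lie strictly outside the unit circle."""
    # np.roots wants coefficients ordered from the highest power down
    coefs = np.r_[-np.asarray(phi, dtype=float)[::-1], 1.0]
    roots = np.roots(coefs)
    return bool(np.all(np.abs(roots) > 1))

print(is_causal([0.5]))  # zero at z = 2, |z| > 1 -> True
print(is_causal([1.2]))  # zero at z = 1/1.2, |z| < 1 -> False
```

The same check applied to $\theta(z)$ (with coefficients $+\theta_1, \ldots, +\theta_q$ and leading 1) tests invertibility.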

How do we get ψj’s?

  • Letting $\theta_0 = 1$, $\theta_j = 0$ for $j > q$, and $\psi_j = 0$ for $j < 0$, matching coefficients of $z^j$ in $$1 + \theta_1 z + \cdots + \theta_q z^q = (1 - \phi_1 z - \cdots - \phi_p z^p)(\psi_0 + \psi_1 z + \cdots)$$ gives $$\theta_j \mathbb{1}[j \le q] = \psi_j - \sum_{k=1}^{p} \phi_k \psi_{j-k}, \quad j = 0, 1, \ldots$$

  • Example: causal ARMA(1,1)
    $$1 = \psi_0$$ $$\theta = \psi_1 - \phi \psi_0 \implies \psi_1 = \theta + \phi$$ $$0 = \psi_j - \phi \psi_{j-1} \text{ for } j \ge 2 \implies \psi_j = \phi \psi_{j-1}$$

Therefore, $\psi_0 = 1$ and $\psi_j = \phi^{j-1}(\theta + \phi)$ for $j \ge 1$
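The general recursion can be coded directly and checked against the ARMA(1,1) closed form. A sketch with illustrative parameter values ($\phi = 0.5$, $\theta = 0.4$) of my own choosing:

```python
import numpy as np

def psi_weights(phi, theta, n):
    """psi_0,...,psi_{n-1} from theta_j 1[j<=q] = psi_j - sum_k phi_k psi_{j-k}."""
    p, q = len(phi), len(theta)
    psi = np.zeros(n)
    for j in range(n):
        th = theta[j - 1] if 1 <= j <= q else (1.0 if j == 0 else 0.0)
        psi[j] = th + sum(phi[k - 1] * psi[j - k] for k in range(1, min(j, p) + 1))
    return psi

phi_, theta_ = 0.5, 0.4
psi = psi_weights([phi_], [theta_], 10)
# Closed form for ARMA(1,1): psi_0 = 1, psi_j = phi^{j-1} (theta + phi) for j >= 1
closed = np.r_[1.0, (theta_ + phi_) * phi_ ** np.arange(9)]
assert np.allclose(psi, closed)
```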

Invertibility

Invertibility: θ(z) has no zeros in the closed unit disk |z| ≤ 1

  • An ARMA(p,q) process is invertible if there exist $\pi_0, \pi_1, \ldots$ such that $$\sum_{j=0}^{\infty} |\pi_j| < \infty \quad \text{and} \quad Z_t = \sum_{j=0}^{\infty} \pi_j X_{t-j} \quad \text{for all } t$$

  • Theorem (equivalent condition of invertibility): $$\theta(z) = 1 + \theta_1 z + \cdots + \theta_q z^q \neq 0 \quad \text{for all } |z| \le 1$$

  • Example: ARMA(1,1), $X_t - \phi X_{t-1} = Z_t + \theta Z_{t-1}$. The MA polynomial $1 + \theta z$ has only one zero, $z = -1/\theta$, so $|z| = 1/|\theta| > 1$, i.e., $|\theta| < 1$ is equivalent to invertibility

How do we get πj’s?

  • Letting $\pi_j = 0$ for $j < 0$ and matching coefficients of $z^j$ in $$1 - \phi_1 z - \cdots - \phi_p z^p = (1 + \theta_1 z + \cdots + \theta_q z^q)(\pi_0 + \pi_1 z + \cdots)$$ gives $\pi_0 = 1$ and $$-\phi_j \mathbb{1}[j \le p] = \pi_j + \sum_{k=1}^{q} \theta_k \pi_{j-k}, \quad j = 1, 2, \ldots$$

  • Example: invertible ARMA(1,1)
    $$1 = \pi_0$$ $$-\phi = \pi_1 + \theta \pi_0 \implies \pi_1 = -(\phi + \theta)$$ $$0 = \pi_j + \theta \pi_{j-1} \text{ for } j \ge 2 \implies \pi_j = -\theta \pi_{j-1}$$

Therefore, $\pi_0 = 1$ and $\pi_j = (-1)^j \theta^{j-1}(\phi + \theta)$ for $j \ge 1$
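The $\pi_j$ recursion can be sketched the same way and checked against the ARMA(1,1) closed form (parameter values are illustrative, not from the text):

```python
import numpy as np

def pi_weights(phi, theta, n):
    """pi_0,...,pi_{n-1} from -phi_j 1[j<=p] = pi_j + sum_k theta_k pi_{j-k}."""
    p, q = len(phi), len(theta)
    pi = np.zeros(n)
    for j in range(n):
        ph = -phi[j - 1] if 1 <= j <= p else (1.0 if j == 0 else 0.0)
        pi[j] = ph - sum(theta[k - 1] * pi[j - k] for k in range(1, min(j, q) + 1))
    return pi

phi_, theta_ = 0.5, 0.4
pi = pi_weights([phi_], [theta_], 10)
# Closed form for ARMA(1,1): pi_0 = 1, pi_j = (-1)^j theta^{j-1} (phi + theta)
closed = np.r_[1.0, (-1.0) ** np.arange(1, 10) * theta_ ** np.arange(9) * (phi_ + theta_)]
assert np.allclose(pi, closed)
```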

ACF and PACF of an ARMA(p,q) Process

Calculation of the ACVF

Calculation of the ACVF

  • Assume the ARMA(p,q) process {Xt} is causal and invertible

  • Method 1: If $X_t = \sum_{j=0}^{\infty} \psi_j Z_{t-j}$, then $$\gamma(h) = E(X_{t+h} X_t) = \sigma^2 \sum_{j=0}^{\infty} \psi_j \psi_{j+|h|}$$

  • Method 2 (difference equation method): multiply the ARMA equation by $X_t, X_{t-1}, \ldots$ and take expectations

Example: ARMA(1,1)

  • Recall that for a causal ARMA(1,1), in $X_t = \sum_{j=0}^{\infty} \psi_j Z_{t-j}$, we have $\psi_0 = 1$ and $\psi_j = \phi^{j-1}(\theta + \phi)$ for $j \ge 1$

  • Lag-0 autocovariance $$\gamma(0) = \sigma^2 \sum_{j=0}^{\infty} \psi_j^2 = \sigma^2 \left[1 + (\theta + \phi)^2 \sum_{j=0}^{\infty} \phi^{2j}\right] = \sigma^2 \left[1 + \frac{(\theta + \phi)^2}{1 - \phi^2}\right]$$

  • Lag-1 autocovariance $$\gamma(1) = \sigma^2 \sum_{j=0}^{\infty} \psi_j \psi_{j+1} = \sigma^2 \left[\theta + \phi + \frac{(\theta + \phi)^2 \phi}{1 - \phi^2}\right]$$

  • Lag-$k$ autocovariance ($k \ge 2$) $$\gamma(k) = \phi^{k-1} \gamma(1), \quad k \ge 2$$
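Method 1 can be verified numerically by truncating the $\psi$-weight sum. A sketch with illustrative values $\phi = 0.5$, $\theta = 0.4$, $\sigma^2 = 1$ (my own choices):

```python
import numpy as np

phi_, theta_, sigma2 = 0.5, 0.4, 1.0
# psi weights of the causal ARMA(1,1), truncated far enough to converge
psi = np.r_[1.0, (theta_ + phi_) * phi_ ** np.arange(200)]

def gamma(h):
    # Method 1: gamma(h) = sigma^2 * sum_j psi_j psi_{j+|h|}
    h = abs(h)
    return sigma2 * np.sum(psi[: len(psi) - h] * psi[h:])

# Agreement with the closed forms above
assert np.isclose(gamma(0), sigma2 * (1 + (theta_ + phi_) ** 2 / (1 - phi_ ** 2)))
assert np.isclose(gamma(1), sigma2 * (theta_ + phi_ + (theta_ + phi_) ** 2 * phi_ / (1 - phi_ ** 2)))
assert np.isclose(gamma(3), phi_ ** 2 * gamma(1))  # gamma(k) = phi^{k-1} gamma(1)
```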

Use the difference equation method on ARMA(1,1)

  1. Multiply $X_t - \phi X_{t-1} = Z_t + \theta Z_{t-1}$ by $X_t$, then take expectations: $$E(X_t^2) - \phi E(X_t X_{t-1}) = E(X_t Z_t) + \theta E(X_t Z_{t-1})$$ Since $E(X_t Z_k) = E\left[\left(\sum_{j=0}^{\infty} \psi_j Z_{t-j}\right) Z_k\right] = \psi_{t-k} \sigma^2$ for $k \le t$, we have $$\gamma(0) - \phi \gamma(1) = \sigma^2 + \theta(\theta + \phi)\sigma^2$$

  2. Multiply by $X_{t-1}$: $$E(X_{t-1} X_t) - \phi E(X_{t-1}^2) = E(X_{t-1} Z_t) + \theta E(X_{t-1} Z_{t-1})$$ $$\gamma(1) - \phi \gamma(0) = 0 + \theta \sigma^2 \psi_0 = \theta \sigma^2$$

Using the two equations from steps 1 and 2, we can solve for $\gamma(0)$ and $\gamma(1)$

  3. Multiply by $X_{t-k}$, for $k \ge 2$: $$E(X_{t-k} X_t) - \phi E(X_{t-k} X_{t-1}) = E(X_{t-k} Z_t) + \theta E(X_{t-k} Z_{t-1})$$ $$\gamma(k) - \phi \gamma(k-1) = 0 \implies \gamma(k) = \phi \gamma(k-1)$$
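Steps 1 and 2 form a 2×2 linear system in $(\gamma(0), \gamma(1))$, which can be solved directly. A sketch with illustrative parameters (my own choices):

```python
import numpy as np

phi_, theta_, sigma2 = 0.5, 0.4, 1.0
# Steps 1 and 2 give:
#   gamma(0) - phi*gamma(1)  = sigma^2 * (1 + theta*(theta + phi))
#  -phi*gamma(0) + gamma(1)  = theta * sigma^2
A = np.array([[1.0, -phi_], [-phi_, 1.0]])
b = np.array([sigma2 * (1 + theta_ * (theta_ + phi_)), theta_ * sigma2])
g0, g1 = np.linalg.solve(A, b)

# Agrees with Method 1's closed forms
assert np.isclose(g0, sigma2 * (1 + (theta_ + phi_) ** 2 / (1 - phi_ ** 2)))
assert np.isclose(g1, sigma2 * (theta_ + phi_ + (theta_ + phi_) ** 2 * phi_ / (1 - phi_ ** 2)))
```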

Test for MAs and ARs from the ACF and PACF

ACF of an MA(q) process

  • Suppose $\{X_t\}$ is an MA(q); then $\rho(h) = 0$ for all $h > q$
  • By asymptotic normality, $$\hat{\rho}(q+1) \sim N\left(0, \frac{w_{q+1,q+1}}{n}\right)$$ where by Bartlett's formula $$w_{q+1,q+1} = \sum_{k=1}^{\infty} \left[\rho(k+q+1) + \rho(k-q-1) - 2\rho(q+1)\rho(k)\right]^2 = \sum_{k=1}^{\infty} \rho(k-q-1)^2 = 1 + 2\sum_{j=1}^{q} \rho(j)^2$$

Test for an MA(q): from the ACF

  1. Hypotheses: $$H_0: \{X_t\} \sim \mathrm{MA}(q) \quad \text{vs.} \quad H_A: \text{not } H_0$$

  2. Test statistic: $$Z = \frac{\hat{\rho}(q+1) - 0}{\sqrt{\left(1 + 2\sum_{j=1}^{q} \hat{\rho}(j)^2\right)/n}}$$

  3. Reject $H_0$ if $|Z| \ge z_{\alpha/2}$

  • Note: under the null hypothesis, we can use the sample ACF plot with bounds $$\pm 1.96 \times \sqrt{\frac{1 + 2\sum_{j=1}^{q} \hat{\rho}(j)^2}{n}}$$ to check whether $\hat{\rho}(h)$ stays inside the bounds for all $h \ge q+1$. But this may suffer from multiple testing problems.
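The test can be sketched end to end on simulated data. Everything below (the helper name, the simulated MA(1) with $\theta = 0.6$, the seed) is my own illustration, not from the text:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_acf(x, max_lag):
    """Sample autocorrelations rho_hat(0..max_lag)."""
    x = x - x.mean()
    n = len(x)
    denom = np.sum(x * x)
    return np.array([np.sum(x[:n - h] * x[h:]) / denom for h in range(max_lag + 1)])

# Simulate an MA(1): X_t = Z_t + 0.6 Z_{t-1}, then test H0: {X_t} ~ MA(1)
n, theta_ = 2000, 0.6
z = rng.standard_normal(n + 1)
x = z[1:] + theta_ * z[:-1]

q = 1
rho = sample_acf(x, q + 1)
se = np.sqrt((1 + 2 * np.sum(rho[1:q + 1] ** 2)) / n)  # Bartlett standard error
Z = rho[q + 1] / se
print(abs(Z) >= 1.96)  # reject H0 at level 0.05?
```

Since the data really are MA(1), the test should usually fail to reject.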

Partial autocorrelation function (PACF)

  • We define the partial autocorrelation function (PACF) of an ARMA process as the function $\alpha(\cdot)$: $$\alpha(0) = 1, \quad \alpha(h) = \phi_{hh} \text{ for } h \ge 1$$ Here, $\phi_{hh}$ is the last entry of $$\phi_h = \Gamma_h^{-1} \gamma_h, \quad \text{where } \Gamma_h = [\gamma(i-j)]_{i,j=1}^{h}, \quad \gamma_h = [\gamma(1), \ldots, \gamma(h)]'$$

  • Sample PACF $\hat{\alpha}(\cdot)$: replace every $\gamma(\cdot)$ above by $\hat{\gamma}(\cdot)$

  • Recall: in the Durbin–Levinson algorithm, $$\hat{X}_{n+1} = \phi_{n1} X_n + \cdots + \phi_{nn} X_1,$$ and $\phi_{nn} = \alpha(n)$, the PACF at lag $n$

PACF property

  • $\phi_{nn}$ is the correlation between the prediction errors: $$\alpha(n) = \mathrm{Corr}\left(X_n - P(X_n \mid X_1, \ldots, X_{n-1}),\; X_0 - P(X_0 \mid X_1, \ldots, X_{n-1})\right)$$

  • Theorem: A stationary series is AR(p) if and only if α(h)=0 for all h>p

  • If $\{X_t\}$ is an AR(p), then we have asymptotic normality: $$\hat{\alpha}(h) \sim N\left(0, \frac{1}{n}\right) \quad \text{for all } h > p$$
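The definition $\alpha(h) = $ last entry of $\Gamma_h^{-1}\gamma_h$, and the AR(p) cutoff property, can both be checked on the known ACVF of an AR(1). A sketch (the AR(1) value $\phi = 0.7$ is my own illustration):

```python
import numpy as np

def pacf_from_acvf(gamma, max_lag):
    """alpha(h) = last entry of Gamma_h^{-1} gamma_h, for h = 1..max_lag."""
    alpha = [1.0]
    for h in range(1, max_lag + 1):
        Gamma = np.array([[gamma[abs(i - j)] for j in range(h)] for i in range(h)])
        g = np.array(gamma[1:h + 1])
        alpha.append(np.linalg.solve(Gamma, g)[-1])
    return np.array(alpha)

# ACVF of an AR(1) with sigma^2 = 1: gamma(h) = phi^h / (1 - phi^2)
phi_ = 0.7
gamma = [phi_ ** h / (1 - phi_ ** 2) for h in range(6)]
alpha = pacf_from_acvf(gamma, 5)
assert np.isclose(alpha[1], phi_)   # alpha(1) = phi
assert np.allclose(alpha[2:], 0.0)  # PACF of an AR(1) cuts off after lag 1
```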

Test for an AR(p): from the PACF

  1. Hypotheses: $$H_0: \{X_t\} \sim \mathrm{AR}(p) \quad \text{vs.} \quad H_A: \text{not } H_0$$

  2. Test statistic: $$Z = \frac{\hat{\alpha}(p+1) - 0}{\sqrt{1/n}}$$

  3. Reject $H_0$ if $|Z| \ge z_{\alpha/2}$

  • Note: under the null hypothesis, we can use the sample PACF plot with bounds $\pm 1.96/\sqrt{n}$ to check whether $\hat{\alpha}(h)$ stays inside the bounds for all $h \ge p+1$. But this may suffer from multiple testing problems.
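The AR test can likewise be sketched on simulated data, computing the sample PACF at lag $p+1$ from sample autocovariances. The simulated AR(1) with $\phi = 0.7$ and the seed are my own illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate an AR(1): X_t = 0.7 X_{t-1} + Z_t, then test H0: {X_t} ~ AR(1)
n, phi_ = 2000, 0.7
x = np.zeros(n)
z = rng.standard_normal(n)
for t in range(1, n):
    x[t] = phi_ * x[t - 1] + z[t]

xc = x - x.mean()
gamma_hat = [np.sum(xc[:n - h] * xc[h:]) / n for h in range(3)]
# Sample PACF at lag p+1 = 2: last entry of Gamma_2^{-1} gamma_2
Gamma = np.array([[gamma_hat[0], gamma_hat[1]], [gamma_hat[1], gamma_hat[0]]])
alpha2 = np.linalg.solve(Gamma, np.array(gamma_hat[1:3]))[-1]
Z = alpha2 / np.sqrt(1 / n)
print(abs(Z) >= 1.96)  # reject H0 at level 0.05?
```

Since the data really are AR(1), $\hat{\alpha}(2)$ should usually fall inside the $\pm 1.96/\sqrt{n}$ bounds.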

Forecast ARMA Processes

Forecast ARMA(p,q) using the innovation algorithm

  • Let m=max(p,q)

  • One-step prediction: $$\hat{X}_{n+1} = \begin{cases} \sum_{j=1}^{n} \theta_{nj}(X_{n+1-j} - \hat{X}_{n+1-j}), & n < m \\ \sum_{i=1}^{p} \phi_i X_{n+1-i} + \sum_{j=1}^{q} \theta_{nj}(X_{n+1-j} - \hat{X}_{n+1-j}), & n \ge m \end{cases}$$

    • Special case: AR(p) process $$\hat{X}_{n+1} = \sum_{i=1}^{p} \phi_i X_{n+1-i}, \quad n \ge p$$
  • $h$-step prediction: for $n > m$ and all $h \ge 1$, $$P_n X_{n+h} = \sum_{i=1}^{p} \phi_i P_n X_{n+h-i} + \sum_{j=h}^{q} \theta_{n+h-1,j}(X_{n+h-j} - \hat{X}_{n+h-j})$$
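For the AR(p) special case (where the innovation terms drop out), the forecast recursion is simple to sketch. The function name and test values are my own illustration:

```python
import numpy as np

def ar_forecast(x, phi, h):
    """h-step forecasts for a mean-zero AR(p): each P_n X_{n+k} is the AR
    recursion applied to observed values and earlier forecasts."""
    phi = np.asarray(phi, dtype=float)
    p = len(phi)
    hist = list(x)
    preds = []
    for _ in range(h):
        nxt = float(np.dot(phi, hist[-1:-p - 1:-1]))  # phi_1 * latest + ... + phi_p
        preds.append(nxt)
        hist.append(nxt)  # future forecasts feed on earlier forecasts
    return preds

# For an AR(1), P_n X_{n+h} = phi^h X_n
phi_ = 0.5
preds = ar_forecast([0.3, -0.1, 2.0], [phi_], 3)
assert np.allclose(preds, [phi_ * 2.0, phi_ ** 2 * 2.0, phi_ ** 3 * 2.0])
```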

Innovation algorithm parameters vs MA parameters

  • Innovation algorithm parameters converge to the MA parameters: if $\{X_t\}$ is invertible, then as $n \to \infty$, $$\theta_{nj} \to \theta_j, \quad j = 1, 2, \ldots, q$$

  • Prediction MSE converges to $\sigma^2$: let $$v_n = E(X_{n+1} - \hat{X}_{n+1})^2 = r_n \sigma^2$$ If $\{X_t\}$ is invertible, then as $n \to \infty$, $r_n \to 1$

References

  • Brockwell, Peter J. and Davis, Richard A. (2016), Introduction to Time Series and Forecasting, Third Edition. New York: Springer