Parameter estimation for ARMA
- When the orders $p$ and $q$ are known, estimate the parameters $\phi_1,\dots,\phi_p$, $\theta_1,\dots,\theta_q$, and $\sigma^2$
- There are $p+q+1$ parameters in total
- Preliminary estimations
- Yule-Walker and Burg’s algorithm: good for AR
- Innovation algorithm: good for MA
- Hannan-Rissanen algorithm: good for ARMA
More efficient estimation: MLE
- When the orders are unknown, use model selection methods to select orders
- Minimize one-step MSE: FPE
- Penalized likelihood methods: AIC, AICC, BIC
Yule-Walker Estimation
Yule-Walker equations
Suppose $\{X_t\}$ is a causal AR($p$) process: $X_t = \phi_1 X_{t-1} + \cdots + \phi_p X_{t-p} + Z_t$, $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$
Multiplying each side by $X_t, X_{t-1}, \dots, X_{t-p}$, respectively, and taking expectations, we get the Yule-Walker equations: $\gamma(j) = \phi_1\gamma(j-1) + \cdots + \phi_p\gamma(j-p)$ for $j = 1,\dots,p$, and $\gamma(0) = \phi_1\gamma(1) + \cdots + \phi_p\gamma(p) + \sigma^2$
Vector representation: $\Gamma_p \boldsymbol\phi = \boldsymbol\gamma_p$ and $\sigma^2 = \gamma(0) - \boldsymbol\phi^\top \boldsymbol\gamma_p$, where $\Gamma_p = [\gamma(i-j)]_{i,j=1}^{p}$, $\boldsymbol\phi = (\phi_1,\dots,\phi_p)^\top$, and $\boldsymbol\gamma_p = (\gamma(1),\dots,\gamma(p))^\top$
Yule-Walker estimator and its properties
Yule-Walker estimators are obtained by solving the hatted version of the Yule-Walker equations, with the ACVF replaced by the sample ACVF: $\hat{\boldsymbol\phi} = \hat\Gamma_p^{-1}\hat{\boldsymbol\gamma}_p$ and $\hat\sigma^2 = \hat\gamma(0) - \hat{\boldsymbol\phi}^\top \hat{\boldsymbol\gamma}_p$
The fitted model is always causal
Asymptotic normality: when $n$ is large, $\hat{\boldsymbol\phi}$ is approximately $N\big(\boldsymbol\phi,\, n^{-1}\sigma^2\Gamma_p^{-1}\big)$, and $\hat\sigma^2$ is close to $\sigma^2$
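As an illustration, here is a minimal numpy sketch of Yule-Walker fitting from the sample ACVF; the helper names (`sample_acvf`, `yule_walker_fit`) and the simulated AR(2) series are illustrative, not part of the slides.

```python
import numpy as np

def sample_acvf(x, max_lag):
    """Sample autocovariances gamma_hat(0..max_lag) with the 1/n convention."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    return np.array([np.sum(xc[h:] * xc[:n - h]) / n for h in range(max_lag + 1)])

def yule_walker_fit(x, p):
    """Solve the hatted Yule-Walker equations: Gamma_hat * phi_hat = gamma_hat."""
    g = sample_acvf(x, p)
    Gamma = np.array([[g[abs(i - j)] for j in range(p)] for i in range(p)])
    gamma_p = g[1:p + 1]
    phi_hat = np.linalg.solve(Gamma, gamma_p)
    sigma2_hat = g[0] - phi_hat @ gamma_p
    return phi_hat, sigma2_hat

# Illustration on a simulated causal AR(2) series
rng = np.random.default_rng(0)
n, phi = 500, np.array([0.5, 0.3])
x = np.zeros(n)
for t in range(2, n):
    x[t] = phi[0] * x[t - 1] + phi[1] * x[t - 2] + rng.standard_normal()
phi_hat, sigma2_hat = yule_walker_fit(x, p=2)
print(phi_hat, sigma2_hat)  # should be near (0.5, 0.3) and 1
```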
The Yule-Walker estimator is a moment estimator: it is obtained by equating theoretical and sample moments
Usually moment estimators have much higher variance than the MLE
But the Yule-Walker estimator of an AR process has the same asymptotic distribution as the MLE
Moment estimators can fail for MA and general ARMA processes
- For example, MA(1): $X_t = Z_t + \theta Z_{t-1}$ with $\{Z_t\} \sim \mathrm{WN}(0,\sigma^2)$. The moment estimator of $\theta$ is obtained by solving $\hat\rho(1) = \hat\theta/(1+\hat\theta^2)$. This can yield a complex $\hat\theta$ if $|\hat\rho(1)| > 1/2$, which can happen if $|\rho(1)|$ is close to $1/2$, i.e., $\theta$ is close to $\pm 1$
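A quick numerical check of this point (the helper name and the values of $\hat\rho(1)$ are illustrative): solving the moment equation amounts to finding the roots of $\hat\rho(1)\,\theta^2 - \theta + \hat\rho(1) = 0$, which become complex once $|\hat\rho(1)| > 1/2$.

```python
import numpy as np

def ma1_moment_estimate(rho1_hat):
    """Roots of rho1_hat * theta^2 - theta + rho1_hat = 0, i.e. rho1_hat = theta / (1 + theta^2)."""
    return np.roots([rho1_hat, -1.0, rho1_hat])

print(ma1_moment_estimate(0.40))  # two real roots; the invertible solution has |theta| < 1
print(ma1_moment_estimate(0.55))  # complex roots: the moment estimator breaks down
```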
Innovations algorithm: estimate MA coefficients
Fitted innovations MA($m$) model: $X_t = Z_t + \hat\theta_{m1} Z_{t-1} + \cdots + \hat\theta_{mm} Z_{t-m}$, $\{Z_t\} \sim \mathrm{WN}(0, \hat v_m)$, where $\hat\theta_{m1},\dots,\hat\theta_{mm}$ and $\hat v_m$ are from the innovations algorithm with the ACVF replaced by the sample ACVF
For an MA($q$) process, the innovations algorithm estimator with $m = q$, i.e., $(\hat\theta_{q1},\dots,\hat\theta_{qq})$, is NOT consistent for $(\theta_1,\dots,\theta_q)$
Choice of $m$: increase $m$ until the vector $(\hat\theta_{m1},\dots,\hat\theta_{mm})$ stabilizes
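Below is a minimal sketch of the innovations recursions, assuming the standard formulation; the function name and the MA(1) sanity check are illustrative. Feeding it the sample ACVF of the data gives the preliminary estimates $(\hat\theta_{m1},\dots,\hat\theta_{mm})$ and $\hat v_m$.

```python
import numpy as np

def innovations_ma(gamma, m):
    """Innovations algorithm on an ACVF sequence gamma[0..m].

    Returns (theta, v): theta[n-1, j-1] holds theta_{n,j} for n = 1..m,
    j = 1..n, and v[n] is the one-step prediction MSE v_n.
    """
    theta = np.zeros((m, m))
    v = np.zeros(m + 1)
    v[0] = gamma[0]
    for n in range(1, m + 1):
        for k in range(n):
            s = sum(theta[k - 1, k - 1 - j] * theta[n - 1, n - 1 - j] * v[j]
                    for j in range(k))
            theta[n - 1, n - 1 - k] = (gamma[n - k] - s) / v[k]
        v[n] = gamma[0] - sum(theta[n - 1, n - 1 - j] ** 2 * v[j] for j in range(n))
    return theta, v

# Sanity check on the theoretical ACVF of an MA(1) with theta = 0.6, sigma^2 = 1
theta_true, m = 0.6, 10
gamma = np.zeros(m + 1)
gamma[0], gamma[1] = 1 + theta_true ** 2, theta_true
th, v = innovations_ma(gamma, m)
print(th[m - 1, 0], v[m])  # theta_{m,1} approaches 0.6 and v_m approaches 1 as m grows
```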
Maximum Likelihood Estimation
Likelihood function of a Gaussian time series
Suppose $\{X_t\}$ is a Gaussian time series with mean zero
Assume that the covariance matrix $\Gamma_n = \mathrm{E}(\mathbf{X}_n \mathbf{X}_n^\top)$ of $\mathbf{X}_n = (X_1,\dots,X_n)^\top$ is nonsingular
One-step predictors using the innovations algorithm: $\hat X_1 = 0$ and $\hat X_j = P(X_j \mid X_1,\dots,X_{j-1})$ for $j \ge 2$, with MSEs $v_{j-1} = \mathrm{E}(X_j - \hat X_j)^2$
- Example: for a causal AR($p$) process and $j > p$, $\hat X_j = \phi_1 X_{j-1} + \cdots + \phi_p X_{j-p}$ and $v_{j-1} = \sigma^2$
Likelihood function: $L(\Gamma_n) = (2\pi)^{-n/2}\Big(\prod_{j=1}^{n} v_{j-1}\Big)^{-1/2}\exp\Big(-\frac{1}{2}\sum_{j=1}^{n}\frac{(X_j - \hat X_j)^2}{v_{j-1}}\Big)$
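Why the likelihood takes this product form is a standard argument, sketched here (not spelled out on the slides): the innovations $I_j = X_j - \hat X_j$ are uncorrelated with variances $v_{j-1}$, and $\mathbf{X}_n$ is a unit-lower-triangular transform of them.

```latex
% Sketch: with I_j = X_j - \hat X_j and \mathbf{X}_n = C_n \mathbf{I}_n
% for a unit-lower-triangular C_n (innovations representation),
\[
  \Gamma_n = C_n D_n C_n^\top, \qquad D_n = \operatorname{diag}(v_0,\dots,v_{n-1}),
\]
\[
  \det \Gamma_n = \det D_n = \prod_{j=1}^{n} v_{j-1}, \qquad
  \mathbf{X}_n^\top \Gamma_n^{-1} \mathbf{X}_n
    = \mathbf{I}_n^\top D_n^{-1} \mathbf{I}_n
    = \sum_{j=1}^{n} \frac{(X_j - \hat X_j)^2}{v_{j-1}},
\]
% which gives L(\Gamma_n) as displayed above.
```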
Maximum likelihood estimation of ARMA
Innovations MSEs: $v_{j-1} = \sigma^2 r_{j-1}$, where $r_{j-1}$ depends on $\boldsymbol\phi$ and $\boldsymbol\theta$ but not on $\sigma^2$
Maximizing the likelihood is equivalent to minimizing $-\frac{2}{n}\ln L(\boldsymbol\phi,\boldsymbol\theta,\sigma^2) = \ln(2\pi\sigma^2) + \frac{1}{n}\sum_{j=1}^{n}\ln r_{j-1} + \frac{S(\boldsymbol\phi,\boldsymbol\theta)}{n\sigma^2}$, where $S(\boldsymbol\phi,\boldsymbol\theta) = \sum_{j=1}^{n}\frac{(X_j - \hat X_j)^2}{r_{j-1}}$
The MLE of $\sigma^2$ can be expressed with the MLE of $(\boldsymbol\phi,\boldsymbol\theta)$: $\hat\sigma^2 = n^{-1} S(\hat{\boldsymbol\phi}, \hat{\boldsymbol\theta})$
The MLEs $\hat{\boldsymbol\phi}, \hat{\boldsymbol\theta}$ are obtained by minimizing $\ell(\boldsymbol\phi,\boldsymbol\theta) = \ln\big(n^{-1}S(\boldsymbol\phi,\boldsymbol\theta)\big) + n^{-1}\sum_{j=1}^{n}\ln r_{j-1}$, which does not depend on $\sigma^2$!
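In practice this objective is minimized numerically. A possible sketch using statsmodels (assuming that library is available; the simulated ARMA(1,1) series and the chosen order are purely illustrative):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Simulate an ARMA(1,1) series purely for illustration
rng = np.random.default_rng(1)
n, phi, theta = 500, 0.5, 0.4
z = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + z[t] + theta * z[t - 1]

# Gaussian MLE of a zero-mean ARMA(1,1) = ARIMA(1, 0, 1) without a constant term
res = ARIMA(x, order=(1, 0, 1), trend="n").fit()
print(res.params)  # estimated AR, MA, and sigma^2 parameters
print(res.llf)     # maximized log-likelihood
```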
Asymptotic normality of MLE
When $n$ is large, for a causal and invertible ARMA($p,q$) process, $\hat{\boldsymbol\beta} = (\hat{\boldsymbol\phi}, \hat{\boldsymbol\theta})$ is approximately $N\big(\boldsymbol\beta,\, n^{-1} V(\boldsymbol\beta)\big)$
For an AR process, the MLE has the same asymptotic distribution as the Yule-Walker estimator
Examples of $V(\boldsymbol\beta)$:
- AR(1): $V(\phi) = 1 - \phi^2$
- AR(2): $V(\boldsymbol\phi) = \begin{pmatrix} 1-\phi_2^2 & -\phi_1(1+\phi_2) \\ -\phi_1(1+\phi_2) & 1-\phi_2^2 \end{pmatrix}$
- MA(1): $V(\theta) = 1 - \theta^2$
- MA(2): $V(\boldsymbol\theta) = \begin{pmatrix} 1-\theta_2^2 & \theta_1(1-\theta_2) \\ \theta_1(1-\theta_2) & 1-\theta_2^2 \end{pmatrix}$
- ARMA(1,1): $V(\phi,\theta) = \dfrac{1+\phi\theta}{(\phi+\theta)^2}\begin{pmatrix} (1-\phi^2)(1+\phi\theta) & -(1-\phi^2)(1-\theta^2) \\ -(1-\phi^2)(1-\theta^2) & (1-\theta^2)(1+\phi\theta) \end{pmatrix}$
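For instance, the AR(1) entry $V(\phi) = 1 - \phi^2$ translates into an approximate standard error and 95% confidence interval for $\hat\phi$; a tiny sketch (the values of `phi_hat` and `n` are made up for illustration):

```python
import numpy as np

# Approximate 95% CI for phi in a fitted AR(1), using V(phi) = 1 - phi^2
phi_hat, n = 0.62, 200  # illustrative values, not from the slides
se = np.sqrt((1 - phi_hat ** 2) / n)
print(se, (phi_hat - 1.96 * se, phi_hat + 1.96 * se))
```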
Order Selection
Order selection
Why? Harm of using too large $p$ and $q$ to fit models:
- Large errors arising from parameter estimation of the model
- Large MSEs of forecasts
FPE (final prediction error): only for AR processes; choose $p$ to minimize $\mathrm{FPE} = \hat\sigma^2\,\dfrac{n+p}{n-p}$
AIC: for ARMA; approximates the Kullback-Leibler discrepancy between the fitted model and the true model; a penalized likelihood method, $\mathrm{AIC} = -2\ln L + 2(p+q+1)$
AICC: for ARMA; a bias-corrected version of AIC; a penalized likelihood method, $\mathrm{AICC} = -2\ln L + \dfrac{2(p+q+1)n}{n-p-q-2}$
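A possible order-selection sketch: fit candidate ARMA($p,q$) models by Gaussian MLE and keep the pair minimizing AICC (statsmodels is assumed available; the grid bounds and function names are illustrative):

```python
import itertools
from statsmodels.tsa.arima.model import ARIMA

def aicc(llf, n, p, q):
    """AICC = -2 * log-likelihood + 2(p + q + 1) n / (n - p - q - 2)."""
    k = p + q + 1
    return -2.0 * llf + 2.0 * k * n / (n - k - 1)

def select_order(x, max_p=3, max_q=3):
    """Return (AICC, p, q) for the AICC-minimizing zero-mean ARMA(p, q) fit."""
    n, best = len(x), None
    for p, q in itertools.product(range(max_p + 1), range(max_q + 1)):
        res = ARIMA(x, order=(p, 0, q), trend="n").fit()
        score = aicc(res.llf, n, p, q)
        if best is None or score < best[0]:
            best = (score, p, q)
    return best
```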
Diagnostic Checking
Residuals and rescaled residuals
- Residuals of a fitted ARMA model: $\hat W_t = \dfrac{X_t - \hat X_t(\hat{\boldsymbol\phi},\hat{\boldsymbol\theta})}{\sqrt{r_{t-1}(\hat{\boldsymbol\phi},\hat{\boldsymbol\theta})}}$
- Residuals should behave like white noise
- Rescaled residuals: $\hat R_t = \hat W_t/\hat\sigma$, where $\hat\sigma^2 = n^{-1}\sum_{t=1}^{n}\hat W_t^2$
- Rescaled residuals should be approximately WN(0, 1)
Residual diagnostics
Plot the rescaled residuals and look for patterns
- Compute the sample ACF of the rescaled residuals
- It should be close to the sample ACF of IID noise: roughly 95% of the sample autocorrelations at nonzero lags should fall within $\pm 1.96/\sqrt{n}$
Apply the tests for IID noise from Chapter 1
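A possible diagnostics sketch using a fitted statsmodels result (e.g., from the MLE sketch above): it rescales the one-step residuals, computes their sample ACF against the $\pm 1.96/\sqrt{n}$ bounds, and applies the Ljung-Box portmanteau test as one of the IID-noise tests. Note that the library's `resid` is used here as a stand-in for the $\hat W_t$ defined above.

```python
import numpy as np
from statsmodels.stats.diagnostic import acorr_ljungbox
from statsmodels.tsa.stattools import acf

def residual_checks(res, n_lags=20):
    """Basic residual diagnostics for a fitted statsmodels ARIMA result `res`."""
    w = res.resid                          # one-step prediction residuals
    r = w / np.std(w)                      # rescaled residuals: roughly WN(0, 1) if the model fits
    rho = acf(r, nlags=n_lags)             # sample ACF of the rescaled residuals
    bound = 1.96 / np.sqrt(len(r))         # ~95% of the autocorrelations should fall within +/- bound
    lb = acorr_ljungbox(r, lags=[n_lags])  # Ljung-Box portmanteau test for IID noise
    return rho, bound, lb
```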