Best linear predictor
Goal: find a function of $X_n$ that gives the "best" predictor of $X_{n+h}$.
- By "best" we mean achieving minimum mean squared error
- Under the joint normality assumption of $X_n$ and $X_{n+h}$, the best estimator is $E(X_{n+h} \mid X_n) = \mu + \rho(h)\,(X_n - \mu)$
Best linear predictor
- For Gaussian processes, the best predictor $E(X_{n+h} \mid X_n)$ and the best linear predictor $\ell(X_n) = a X_n + b$ are the same.
- The best linear predictor only depends on the mean and ACF of the series
Properties of ACVF and ACF
$\gamma(0) \ge 0$ and $|\gamma(h)| \le \gamma(0)$ for all $h$
$\gamma(\cdot)$ is an even function, i.e., $\gamma(h) = \gamma(-h)$ for all $h$
A real-valued function $\kappa$ defined on the integers is nonnegative definite if $\sum_{i,j=1}^{n} a_i \kappa(i-j) a_j \ge 0$ for all positive integers $n$ and vectors $\mathbf{a} = (a_1, \dots, a_n)' \in \mathbb{R}^n$
Theorem: a real-valued function defined on the integers is the autocovariance function of a stationary time series if and only if it is even and nonnegative definite (a numerical illustration appears after this list)
The ACF has all the above properties of the ACVF
- Plus one more: $\rho(0) = 1$
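To make the nonnegative definiteness condition concrete, here is a minimal numerical sketch (the function name and the candidate function are illustrative, not from the slides): it builds the Toeplitz matrix $[\kappa(|i-j|)]$ and checks its smallest eigenvalue. A candidate with $\kappa(0) = 1$, $\kappa(1) = \rho$, and $\kappa(h) = 0$ otherwise is a valid ACVF only when $|\rho| \le 1/2$, which the check reproduces.

```python
import numpy as np

def min_eigenvalue(kappa_vals):
    """Smallest eigenvalue of the Toeplitz matrix [kappa(|i - j|)],
    built from kappa_vals = (kappa(0), kappa(1), ..., kappa(n-1))."""
    n = len(kappa_vals)
    lags = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    return np.linalg.eigvalsh(np.asarray(kappa_vals, dtype=float)[lags]).min()

# Candidate: kappa(0) = 1, kappa(1) = rho, kappa(h) = 0 for h >= 2.
# It is nonnegative definite (hence a valid ACVF) only when |rho| <= 1/2.
for rho in (0.4, 0.6):
    vals = np.r_[1.0, rho, np.zeros(18)]
    print(rho, min_eigenvalue(vals) >= -1e-12)   # True for 0.4, False for 0.6
```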
Linear Processes
Linear processes: definitions
A time series $\{X_t\}$ is a linear process if $X_t = \sum_{j=-\infty}^{\infty} \psi_j Z_{t-j}$ for all $t$, where $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$, and the constants $\psi_j$ satisfy $\sum_{j=-\infty}^{\infty} |\psi_j| < \infty$
Equivalent representation using the backward shift operator: $X_t = \psi(B) Z_t$, where $\psi(B) = \sum_{j=-\infty}^{\infty} \psi_j B^j$
Special case: moving average MA($\infty$), where $\psi_j = 0$ for all $j < 0$, so that $X_t = \sum_{j=0}^{\infty} \psi_j Z_{t-j}$
Linear processes: properties
In the linear process definition, the condition $\sum_{j} |\psi_j| < \infty$ ensures
- The infinite sum converges with probability 1
- $\sum_{j} \psi_j^2 < \infty$, and hence the sum also converges in mean square, i.e., $X_t$ is the mean square limit of the partial sums $\sum_{|j| \le n} \psi_j Z_{t-j}$
Applying a linear filter to a stationary time series yields an output series that is also stationary
Theorem: let $\{Y_t\}$ be a stationary time series with mean 0 and ACVF $\gamma_Y$. If $\sum_{j} |\psi_j| < \infty$, then the time series $X_t = \sum_{j=-\infty}^{\infty} \psi_j Y_{t-j} = \psi(B) Y_t$ is stationary with mean 0 and ACVF $\gamma_X(h) = \sum_{j=-\infty}^{\infty} \sum_{k=-\infty}^{\infty} \psi_j \psi_k\, \gamma_Y(h + k - j)$
Special case of the above result: if $\{X_t\}$ is a linear process (i.e., $Y_t = Z_t \sim \mathrm{WN}(0, \sigma^2)$), then its ACVF is $\gamma_X(h) = \sigma^2 \sum_{j=-\infty}^{\infty} \psi_j \psi_{j+|h|}$
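A minimal sketch (illustrative function and parameter names, not from the slides) of the ACVF formula above, applied to an MA(1) process $X_t = Z_t + \theta Z_{t-1}$, whose closed form is $\gamma(0) = \sigma^2(1 + \theta^2)$, $\gamma(1) = \sigma^2\theta$, and $\gamma(h) = 0$ for $|h| \ge 2$:

```python
import numpy as np

def linear_process_acvf(psi, sigma2, max_lag):
    """ACVF of a causal linear process X_t = sum_j psi[j] Z_{t-j}:
    gamma(h) = sigma^2 * sum_j psi_j psi_{j+|h|} (finitely many psi's here)."""
    psi = np.asarray(psi, dtype=float)
    return np.array([sigma2 * np.sum(psi[: len(psi) - h] * psi[h:])
                     for h in range(max_lag + 1)])

theta, sigma2 = 0.6, 1.0
print(linear_process_acvf([1.0, theta], sigma2, max_lag=2))
# [sigma2*(1 + theta^2), sigma2*theta, 0.0] = [1.36, 0.6, 0.0]
```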
Combine multiple linear filters
- Linear filters with absolutely summable coefficients can be applied successively to a stationary series $\{Y_t\}$ to generate a new stationary series $W_t = \sum_j \sum_k \alpha_j \beta_k Y_{t-j-k}$, or equivalently, $W_t = \alpha(B)\beta(B) Y_t = \beta(B)\alpha(B) Y_t$
AR(1) process $X_t = \phi X_{t-1} + Z_t$, $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$, in linear process format
If $|\phi| < 1$, then $X_t = \sum_{j=0}^{\infty} \phi^j Z_{t-j}$ (see the sketch after this list)
- Since $X_t$ only depends on $\{Z_s, s \le t\}$, we say $\{X_t\}$ is causal or future-independent
If $|\phi| > 1$, then $X_t = -\sum_{j=1}^{\infty} \phi^{-j} Z_{t+j}$
- This is because the recursion can be solved forward in time: $X_t = \phi^{-1} X_{t+1} - \phi^{-1} Z_{t+1}$, and iterating gives the sum above
- Since $X_t$ depends on $\{Z_s, s > t\}$, we say $\{X_t\}$ is noncausal
If $|\phi| = 1$, then there is no stationary linear process solution
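As a quick numerical check of the causal case above (all parameter values illustrative), an AR(1) path built from the truncated representation $\sum_{j=0}^{J} \phi^j Z_{t-j}$ satisfies the defining recursion $X_t = \phi X_{t-1} + Z_t$ up to an error of order $\phi^{J+1}$:

```python
import numpy as np

rng = np.random.default_rng(0)
phi, J, n = 0.7, 60, 500               # |phi| < 1; J is the truncation point
Z = rng.normal(size=n + J)

psi = phi ** np.arange(J + 1)          # psi_j = phi^j
X_full = np.convolve(Z, psi)           # X_full[t] = sum_{j <= min(t, J)} psi_j Z[t - j]
X = X_full[J : J + n]                  # keep t >= J so all J + 1 terms are present
Zs = Z[J : J + n]

resid = X[1:] - phi * X[:-1] - Zs[1:]  # equals -phi^(J+1) * Z_{t-1-J}, i.e. tiny
print(np.max(np.abs(resid)))           # on the order of 1e-9
```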
Introduction to ARMA Processes
ARMA process
ARMA process: definitions
The time series $\{X_t\}$ is an ARMA(1,1) process if it is stationary and satisfies $X_t - \phi X_{t-1} = Z_t + \theta Z_{t-1}$ for all $t$, where $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$ and $\phi + \theta \ne 0$
Equivalent representation using the backward shift operator: $\phi(B) X_t = \theta(B) Z_t$, where $\phi(B) = 1 - \phi B$ and $\theta(B) = 1 + \theta B$
ARMA process in linear process format
If $|\phi| \ne 1$, by letting $\psi(B) = \theta(B)/\phi(B)$, we can write an ARMA(1,1) as the linear process $X_t = \psi(B) Z_t = \sum_j \psi_j Z_{t-j}$
If $|\phi| < 1$, then $\frac{1}{1 - \phi B} = \sum_{j=0}^{\infty} \phi^j B^j$, and $X_t = Z_t + (\phi + \theta) \sum_{j=1}^{\infty} \phi^{j-1} Z_{t-j}$ (causal; see the sketch after this list)
If $|\phi| > 1$, then $\frac{1}{1 - \phi B} = -\sum_{j=1}^{\infty} \phi^{-j} B^{-j}$, and $X_t = -\theta \phi^{-1} Z_t - (\theta + \phi) \sum_{j=1}^{\infty} \phi^{-j-1} Z_{t+j}$ (noncausal)
If $|\phi| = 1$, then there is no such stationary ARMA process
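A small sketch (parameters and function name illustrative) that computes the causal $\psi$-weights of an ARMA(1,1) by matching coefficients in $(1 - \phi B)\psi(B) = 1 + \theta B$, and checks them against the closed form $\psi_0 = 1$, $\psi_j = (\phi + \theta)\phi^{j-1}$ for $j \ge 1$:

```python
import numpy as np

def arma11_psi_weights(phi, theta, J):
    """psi-weights of a causal ARMA(1,1), from (1 - phi*B) psi(B) = 1 + theta*B."""
    psi = np.empty(J + 1)
    psi[0] = 1.0                        # coefficient of B^0
    if J >= 1:
        psi[1] = phi + theta            # B^1: psi_1 - phi*psi_0 = theta
    for j in range(2, J + 1):
        psi[j] = phi * psi[j - 1]       # B^j (j >= 2): psi_j - phi*psi_{j-1} = 0
    return psi

phi, theta = 0.5, 0.4                   # |phi| < 1, so the process is causal
psi = arma11_psi_weights(phi, theta, J=5)
closed_form = np.r_[1.0, (phi + theta) * phi ** np.arange(5)]
print(np.allclose(psi, closed_form))    # True
```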
Invertibility
- Invertibility is the dual concept of causality
- Causal: $X_t$ can be expressed in terms of $\{Z_s, s \le t\}$: $X_t = \sum_{j=0}^{\infty} \psi_j Z_{t-j}$
- Invertible: $Z_t$ can be expressed in terms of $\{X_s, s \le t\}$: $Z_t = \sum_{j=0}^{\infty} \pi_j X_{t-j}$
- For an ARMA(1,1) process,
- If $|\theta| < 1$, then it is invertible
- If $|\theta| > 1$, then it is noninvertible
Properties of the Sample ACVF and Sample ACF
Estimation of the series mean
The sample mean $\bar{X}_n = \frac{1}{n} \sum_{t=1}^{n} X_t$ is an unbiased estimator of $\mu$
- Mean squared error: $E(\bar{X}_n - \mu)^2 = \frac{1}{n} \sum_{|h| < n} \left(1 - \frac{|h|}{n}\right) \gamma(h)$
Theorem: If $\{X_t\}$ is a stationary time series with mean $\mu$ and ACVF $\gamma(\cdot)$, then as $n \to \infty$, $E(\bar{X}_n - \mu)^2 \to 0$ if $\gamma(n) \to 0$, and $n\,E(\bar{X}_n - \mu)^2 \to \sum_{|h| < \infty} \gamma(h)$ if $\sum_{h} |\gamma(h)| < \infty$
Confidence bounds of $\mu$
If $\{X_t\}$ is Gaussian, then $\bar{X}_n \sim N\!\left(\mu,\; \frac{1}{n} \sum_{|h| < n} \left(1 - \frac{|h|}{n}\right) \gamma(h)\right)$
- For many common time series, such as linear and ARMA models, when $n$ is large, $\bar{X}_n$ is approximately normal: $\bar{X}_n \approx N(\mu, v/n)$, where $v = \sum_{|h| < \infty} \gamma(h)$
- An approximate 95% confidence interval for $\mu$ is $\left(\bar{X}_n - 1.96\sqrt{v/n},\; \bar{X}_n + 1.96\sqrt{v/n}\right)$
- To estimate $v$, we can use $\hat{v} = \sum_{|h| < \sqrt{n}} \left(1 - \frac{|h|}{\sqrt{n}}\right) \hat{\gamma}(h)$ (see the sketch below)
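A minimal sketch of the interval above (helper names are illustrative), using the sample ACVF defined on the next slide to form $\hat{v}$ and the approximate 95% confidence bounds for $\mu$:

```python
import numpy as np

def sample_acvf(x, h):
    """Sample ACVF: gamma_hat(h) = (1/n) sum_t (x_{t+h} - xbar)(x_t - xbar)."""
    x = np.asarray(x, dtype=float)
    n, xbar = len(x), x.mean()
    return np.sum((x[h:] - xbar) * (x[: n - h] - xbar)) / n

def mean_confidence_interval(x, z=1.96):
    """xbar +/- z * sqrt(v_hat / n), with
    v_hat = sum_{|h| < sqrt(n)} (1 - |h|/sqrt(n)) * gamma_hat(h)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    root_n = np.sqrt(n)
    hmax = int(np.ceil(root_n)) - 1                    # largest |h| with |h| < sqrt(n)
    v_hat = sum((1 - abs(h) / root_n) * sample_acvf(x, abs(h))
                for h in range(-hmax, hmax + 1))
    half = z * np.sqrt(v_hat / n)
    return x.mean() - half, x.mean() + half

rng = np.random.default_rng(2)
x = 1.0 + rng.normal(size=200)                         # illustrative series with mu = 1
print(mean_confidence_interval(x))
```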
Estimation of ACVF and ACF
- Use the sample ACVF $\hat{\gamma}(h) = \frac{1}{n} \sum_{t=1}^{n-|h|} (X_{t+|h|} - \bar{X}_n)(X_t - \bar{X}_n)$ and the sample ACF $\hat{\rho}(h) = \hat{\gamma}(h)/\hat{\gamma}(0)$
- Even if the factor $1/n$ is replaced by $1/(n - h)$, they are still biased
- They are nearly unbiased for large $n$
When $h$ is only slightly smaller than $n$, the estimators are unreliable since there are only a few pairs $(X_{t+h}, X_t)$ available.
- A useful guide for them to be reliable (Box and Jenkins): $n \ge 50$ and $h \le n/4$
Bartlett’s Formula
Asymptotic distribution of $\hat{\boldsymbol{\rho}}_m = (\hat{\rho}(1), \dots, \hat{\rho}(m))'$
For linear models, especially ARMA models, when $n$ is large, $\hat{\boldsymbol{\rho}}_m$ is approximately normal: $\hat{\boldsymbol{\rho}}_m \approx N(\boldsymbol{\rho}_m, n^{-1} W)$
By Bartlett's formula, $W$ is the covariance matrix with entries $w_{ij} = \sum_{k=1}^{\infty} \left[\rho(k+i) + \rho(k-i) - 2\rho(i)\rho(k)\right]\left[\rho(k+j) + \rho(k-j) - 2\rho(j)\rho(k)\right]$
- Special cases
- Marginally, for any $i$, $\hat{\rho}(i) \approx N(\rho(i), w_{ii}/n)$
- IID noise: $\rho(h) = 0$ for $h \ne 0$, so $w_{ii} = 1$ and $\hat{\rho}(i) \approx N(0, 1/n)$; the approximate 95% bounds are $\pm 1.96/\sqrt{n}$ (see the sketch after this list)
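A short check of the IID-noise case (all names and values illustrative): compute the sample ACF of white noise and count how often lags fall outside the $\pm 1.96/\sqrt{n}$ bounds; roughly 5% should.

```python
import numpy as np

def sample_acf(x, max_lag):
    """Sample ACF: rho_hat(h) = gamma_hat(h) / gamma_hat(0), h = 0, ..., max_lag."""
    x = np.asarray(x, dtype=float)
    n, xbar = len(x), x.mean()
    gam = np.array([np.sum((x[h:] - xbar) * (x[: n - h] - xbar)) / n
                    for h in range(max_lag + 1)])
    return gam / gam[0]

rng = np.random.default_rng(1)
n = 400
x = rng.normal(size=n)                          # iid noise
rho_hat = sample_acf(x, max_lag=40)[1:]         # lags 1, ..., 40
outside = np.abs(rho_hat) > 1.96 / np.sqrt(n)
print(outside.mean())                           # close to 0.05 on average
```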
Forecast Stationary Time Series
Best linear predictor: minimizes MSE
Best linear predictor: definition
For a stationary time series $\{X_t\}$ with known mean $\mu$ and ACVF $\gamma$, our goal is to find the linear combination of $1, X_n, X_{n-1}, \dots, X_1$ that forecasts $X_{n+h}$ with minimum mean squared error
Best linear predictor: $P_n X_{n+h} = a_0 + a_1 X_n + \cdots + a_n X_1$
- We need to find the coefficients $a_0, a_1, \dots, a_n$ that minimize $E\left(X_{n+h} - a_0 - a_1 X_n - \cdots - a_n X_1\right)^2$
- We can take partial derivatives with respect to each $a_i$, set them to zero, and solve the resulting system of equations
Best linear predictor: the solution
Plugging the solution in, the linear predictor becomes $P_n X_{n+h} = \mu + \sum_{i=1}^{n} a_i \,(X_{n+1-i} - \mu)$
- The solution of coefficients: $\mathbf{a}_n = (a_1, \dots, a_n)'$ satisfies $\Gamma_n \mathbf{a}_n = \boldsymbol{\gamma}_n(h)$
- $\Gamma_n = \left[\gamma(i-j)\right]_{i,j=1}^{n}$ and $\boldsymbol{\gamma}_n(h) = \left(\gamma(h), \gamma(h+1), \dots, \gamma(h+n-1)\right)'$ (a numerical sketch follows)
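A minimal sketch of solving $\Gamma_n \mathbf{a}_n = \boldsymbol{\gamma}_n(h)$ numerically (function and variable names are illustrative). For the AR(1) ACVF $\gamma(h) = \sigma^2 \phi^{|h|}/(1-\phi^2)$ it recovers the coefficients $(\phi, 0, \dots, 0)'$ used in the example below.

```python
import numpy as np

def blp_coefficients(gamma, n, h):
    """Solve Gamma_n a_n = gamma_n(h), where gamma is a callable ACVF."""
    Gamma = np.array([[gamma(abs(i - j)) for j in range(n)] for i in range(n)])
    gamma_h = np.array([gamma(h + i) for i in range(n)])
    return np.linalg.solve(Gamma, gamma_h)

# Illustrative check with an AR(1) ACVF gamma(h) = sigma^2 phi^|h| / (1 - phi^2):
phi, sigma2 = 0.7, 1.0
gamma = lambda lag: sigma2 * phi ** abs(lag) / (1 - phi ** 2)
print(blp_coefficients(gamma, n=5, h=1).round(6))   # [0.7, 0, 0, 0, 0]
```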
Best linear predictor: properties
Unbiasedness: $E\left[X_{n+h} - P_n X_{n+h}\right] = 0$
Mean squared error (MSE): $E\left(X_{n+h} - P_n X_{n+h}\right)^2 = \gamma(0) - \mathbf{a}_n' \boldsymbol{\gamma}_n(h)$
Orthogonality: $E\left[\left(X_{n+h} - P_n X_{n+h}\right) X_j\right] = 0$ for $j = 1, \dots, n$
- In general, orthogonality means the prediction error is uncorrelated with every random variable used as a predictor
Example: one-step prediction of an AR(1) series
We predict $X_{n+1}$ from $X_n, X_{n-1}, \dots, X_1$
The coefficients $\mathbf{a}_n$ satisfy $\Gamma_n \mathbf{a}_n = \boldsymbol{\gamma}_n(1)$
By guessing, we find a solution $\mathbf{a}_n = (\phi, 0, \dots, 0)'$, i.e., $P_n X_{n+1} = \phi X_n$
- The predictor depends only on $X_n$, not on the earlier values $X_{n-1}, \dots, X_1$
- MSE: $E\left(X_{n+1} - \phi X_n\right)^2 = E\left(Z_{n+1}^2\right) = \sigma^2$
WLOG, we can assume $\mu = 0$ while predicting
Suppose a stationary time series $\{X_t\}$ has mean $\mu$
To predict its future values, we can first create another time series $Y_t = X_t - \mu$ and predict $Y_{n+h}$ by $P_n Y_{n+h} = \sum_{i=1}^{n} a_i Y_{n+1-i}$
Since the ACVFs agree, $\gamma_Y = \gamma_X$, the coefficients $a_i$ are the same for $\{Y_t\}$ and $\{X_t\}$
The best linear predictor for $X_{n+h}$ is $P_n X_{n+h} = \mu + P_n Y_{n+h} = \mu + \sum_{i=1}^{n} a_i \,(X_{n+1-i} - \mu)$
Prediction operator
$Y$ and $\mathbf{W} = (W_n, \dots, W_1)'$ are random variables with finite second moments
- Note: the underlying series does not need to be stationary
Best linear predictor: $P(Y \mid \mathbf{W}) = a_0 + a_1 W_n + \cdots + a_n W_1$
The coefficients satisfy $\Gamma \mathbf{a} = \boldsymbol{\gamma}$ and $a_0 = E(Y) - \mathbf{a}' E(\mathbf{W})$, where $\Gamma = \mathrm{Cov}(\mathbf{W}, \mathbf{W})$ and $\boldsymbol{\gamma} = \mathrm{Cov}(Y, \mathbf{W})$
Properties of $P(Y \mid \mathbf{W})$
Unbiased: $E\left[Y - P(Y \mid \mathbf{W})\right] = 0$
Orthogonal: $E\left[\left(Y - P(Y \mid \mathbf{W})\right) W_i\right] = 0$ for $i = 1, \dots, n$
MSE: $E\left[Y - P(Y \mid \mathbf{W})\right]^2 = \mathrm{Var}(Y) - \mathbf{a}' \boldsymbol{\gamma}$
Linear: $P(\alpha_1 Y_1 + \alpha_2 Y_2 + \beta \mid \mathbf{W}) = \alpha_1 P(Y_1 \mid \mathbf{W}) + \alpha_2 P(Y_2 \mid \mathbf{W}) + \beta$
- Extreme cases
- Perfect prediction: $P\left(\sum_{i=1}^{n} \alpha_i W_i + \beta \mid \mathbf{W}\right) = \sum_{i=1}^{n} \alpha_i W_i + \beta$
- Uncorrelated: if $\mathrm{Cov}(Y, W_i) = 0$ for all $i$, then $P(Y \mid \mathbf{W}) = E(Y)$
Examples: predictions of AR($p$) series
A time series $\{X_t\}$ is an autoregression of order $p$, i.e., AR($p$), if it is stationary and satisfies $X_t = \phi_1 X_{t-1} + \cdots + \phi_p X_{t-p} + Z_t$, where $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$, and $E(X_s Z_t) = 0$ for all $s < t$
When $n \ge p$, the one-step prediction of an AR($p$) series is $P_n X_{n+1} = \phi_1 X_n + \cdots + \phi_p X_{n+1-p}$, with MSE $E\left(X_{n+1} - P_n X_{n+1}\right)^2 = \sigma^2$
$h$-step prediction of an AR($p$) series (proof by recursion): $P_n X_{n+h} = \phi_1 P_n X_{n+h-1} + \cdots + \phi_p P_n X_{n+h-p}$, where $P_n X_{n+j} = X_{n+j}$ for $j \le 0$
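A small sketch of the $h$-step recursion above (function name, parameters, and the AR(2) example are illustrative), where forecasts are substituted for future values as they become available:

```python
import numpy as np

def ar_forecast(x, phi, h):
    """h-step forecasts of a mean-zero AR(p) series via the recursion
    P_n X_{n+k} = phi_1 P_n X_{n+k-1} + ... + phi_p P_n X_{n+k-p},
    with P_n X_{n+j} = X_{n+j} for j <= 0."""
    phi = np.asarray(phi, dtype=float)
    p = len(phi)
    hist = list(np.asarray(x, dtype=float))
    preds = []
    for _ in range(h):
        recent = hist[-1 : -p - 1 : -1]            # (X_n, X_{n-1}, ..., X_{n-p+1})
        preds.append(float(np.dot(phi, recent)))
        hist.append(preds[-1])                     # predicted values feed the recursion
    return np.array(preds)

# Illustrative AR(2) with (phi_1, phi_2) = (0.5, -0.25):
print(ar_forecast([0.3, -0.1, 0.8, 0.4], phi=[0.5, -0.25], h=3))
```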
Recursive methods: the Durbin-Levinson and Innovation Algorithms
Recursive methods for one-step prediction
The best linear predictor solution requires inverting the $n \times n$ matrix $\Gamma_n$
Alternatively, we can use recursion to simplify the one-step prediction of $X_{n+1}$ based on $X_1, \dots, X_n$, for $n = 1, 2, \dots$
- We will introduce
- Durbin-Levinson algorithm: good for AR
- Innovation algorithm: good for MA; innovations are uncorrelated
Durbin-Levinson algorithm
- Assume $\{X_t\}$ is mean zero, stationary, with ACVF $\gamma(\cdot)$; write the one-step predictor as $P_n X_{n+1} = \phi_{n1} X_n + \cdots + \phi_{nn} X_1$ with MSE $v_n$
- Start with $\phi_{11} = \gamma(1)/\gamma(0)$ and $v_1 = \gamma(0)\left[1 - \phi_{11}^2\right]$
For $n = 2, 3, \dots$, compute steps 2-4 successively
Compute $\phi_{nn} = \left[\gamma(n) - \sum_{j=1}^{n-1} \phi_{n-1,j}\, \gamma(n-j)\right] v_{n-1}^{-1}$ (the partial autocorrelation function (PACF) at lag $n$)
Compute $\left(\phi_{n1}, \dots, \phi_{n,n-1}\right)' = \left(\phi_{n-1,1}, \dots, \phi_{n-1,n-1}\right)' - \phi_{nn} \left(\phi_{n-1,n-1}, \dots, \phi_{n-1,1}\right)'$
Compute $v_n = v_{n-1}\left[1 - \phi_{nn}^2\right]$
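A sketch implementing the recursion above (names illustrative). As a check, feeding it the AR(1) ACVF $\gamma(h) = \sigma^2\phi^{|h|}/(1-\phi^2)$ should give PACF $\phi$ at lag 1 and essentially 0 beyond, with one-step MSE $\sigma^2$:

```python
import numpy as np

def durbin_levinson(gamma):
    """Durbin-Levinson recursion from gamma = (gamma(0), ..., gamma(N)).
    Returns phi[n] = (phi_{n1}, ..., phi_{nn}) and the one-step MSEs v[n];
    phi[n][-1] is the PACF at lag n."""
    gamma = np.asarray(gamma, dtype=float)
    N = len(gamma) - 1
    phi = {1: np.array([gamma[1] / gamma[0]])}
    v = {0: gamma[0], 1: gamma[0] * (1 - (gamma[1] / gamma[0]) ** 2)}
    for n in range(2, N + 1):
        prev = phi[n - 1]
        phi_nn = (gamma[n] - np.dot(prev, gamma[n - 1:0:-1])) / v[n - 1]
        phi[n] = np.r_[prev - phi_nn * prev[::-1], phi_nn]
        v[n] = v[n - 1] * (1 - phi_nn ** 2)
    return phi, v

phi_true, sigma2 = 0.7, 1.0
gam = sigma2 * phi_true ** np.arange(6) / (1 - phi_true ** 2)   # AR(1) ACVF
coeffs, mse = durbin_levinson(gam)
print(coeffs[3].round(6), round(mse[3], 6))   # [0.7, 0, 0] and 1.0
```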
Innovation algorithm
Assume $\{X_t\}$ is any mean zero (not necessarily stationary) time series with covariance $\kappa(i, j) = E(X_i X_j)$
Predict $X_{n+1}$ based on the innovations, or one-step prediction errors, $X_j - \hat{X}_j$: $\hat{X}_{n+1} = \sum_{j=1}^{n} \theta_{nj}\left(X_{n+1-j} - \hat{X}_{n+1-j}\right)$, where $\hat{X}_1 = 0$ and $\hat{X}_{k+1} = P_k X_{k+1}$ for $k \ge 1$
- Start with $\hat{X}_1 = 0$ and $v_0 = \kappa(1, 1)$
For $n = 1, 2, \dots$, compute steps 2-3 successively
For $k = 0, 1, \dots, n-1$, compute the coefficients $\theta_{n, n-k} = v_k^{-1}\left[\kappa(n+1, k+1) - \sum_{j=0}^{k-1} \theta_{k, k-j}\, \theta_{n, n-j}\, v_j\right]$
Compute the MSE $v_n = \kappa(n+1, n+1) - \sum_{j=0}^{n-1} \theta_{n, n-j}^2\, v_j$
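A sketch of the recursion above (function and variable names illustrative). For an MA(1) with $\kappa(i,i) = \sigma^2(1+\theta^2)$ and $\kappa(i, i\pm 1) = \sigma^2\theta$, the coefficient $\theta_{n1}$ approaches $\theta$ and $v_n$ approaches $\sigma^2$ as $n$ grows:

```python
import numpy as np

def innovations(kappa, N):
    """Innovations algorithm. kappa(i, j) = E(X_i X_j), with 1-based indices.
    Returns theta[n] = (theta_{n1}, ..., theta_{nn}) and MSEs v[0..N], where
    Xhat_{n+1} = sum_{j=1}^{n} theta_{nj} (X_{n+1-j} - Xhat_{n+1-j})."""
    v = np.empty(N + 1)
    v[0] = kappa(1, 1)
    theta = {}
    for n in range(1, N + 1):
        th = np.empty(n)                     # th[i - 1] stores theta_{n, i}
        for k in range(n):                   # compute theta_{n, n-k} for k = 0, ..., n-1
            s = sum(theta[k][k - 1 - j] * th[n - 1 - j] * v[j] for j in range(k))
            th[n - 1 - k] = (kappa(n + 1, k + 1) - s) / v[k]
        theta[n] = th
        v[n] = kappa(n + 1, n + 1) - sum(th[n - 1 - j] ** 2 * v[j] for j in range(n))
    return theta, v

theta_true, sigma2 = 0.6, 1.0
def kappa_ma1(i, j):                          # covariance of an MA(1) process
    if i == j:
        return sigma2 * (1 + theta_true ** 2)
    return sigma2 * theta_true if abs(i - j) == 1 else 0.0

th, v = innovations(kappa_ma1, N=10)
print(round(th[10][0], 4), round(v[10], 4))   # theta_{10,1} ~ 0.6, v_10 ~ 1.0
```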
$h$-step predictors using innovations
For $1 \le j < h$, the innovation $X_{n+h-j} - \hat{X}_{n+h-j}$ involves values beyond time $n$, and orthogonality ensures $P_n\left(X_{n+h-j} - \hat{X}_{n+h-j}\right) = 0$. Thus, applying $P_n$ to the innovations representation of $\hat{X}_{n+h}$ leaves only the terms with $j \ge h$
The $h$-step prediction: $P_n X_{n+h} = \sum_{j=h}^{n+h-1} \theta_{n+h-1, j}\left(X_{n+h-j} - \hat{X}_{n+h-j}\right)$
References
- Brockwell, Peter J. and Davis, Richard A. (2016), Introduction to Time Series and Forecasting, Third Edition. New York: Springer