Objectives of time series models
- Seasonal adjustment: recognize seasonal components and remove them to study long-term trends
- Separate (or filter) noise from signals
- Prediction
- Test hypotheses
- Predict one series from observations of another
A general approach to time series modeling
- Plot the series and check main features:
- Trend
- Seasonality
- Any sharp changes
- Outliers
- Remove trend and seasonal components to get stationary residuals
- May need data transformation first
- Choose a model to fit the residuals
Stationary Models and Autocorrelation Function
Definitions: stationary
- A series $\{X_t\}$ has
  - Mean function $\mu_X(t) = E(X_t)$ and
  - Covariance function $\gamma_X(r, s) = \mathrm{Cov}(X_r, X_s)$
- $\{X_t\}$ is (weakly) stationary if
  - $\mu_X(t)$ does not depend on $t$
  - $\gamma_X(t + h, t)$ does not depend on $t$, for each $h$
- (Weak) stationarity is defined based on the first- and second-order properties of a series
- $\{X_t\}$ is strictly stationary if $(X_1, \dots, X_n)$ and $(X_{1+h}, \dots, X_{n+h})$ have the same joint distributions for all integers $h$ and $n \ge 1$
- If $\{X_t\}$ is strictly stationary and $E(X_t^2) < \infty$ for all $t$, then $\{X_t\}$ is weakly stationary
- Weak stationarity does not imply strict stationarity
Definitions: autocovariance and autocorrelation
Let $\{X_t\}$ be a stationary time series
- Autocovariance function (ACVF) of $\{X_t\}$ at lag $h$: $\gamma_X(h) = \mathrm{Cov}(X_{t+h}, X_t)$
- Autocorrelation function (ACF) of $\{X_t\}$ at lag $h$: $\rho_X(h) = \gamma_X(h) / \gamma_X(0) = \mathrm{Cor}(X_{t+h}, X_t)$
- Note that $\gamma_X(h) = \gamma_X(-h)$ and $|\gamma_X(h)| \le \gamma_X(0)$
Definitions: sample ACVF and sample ACF
$x_1, \dots, x_n$ are observations of a time series with sample mean $\bar{x} = \frac{1}{n}\sum_{t=1}^{n} x_t$
- Sample autocovariance function: for $-n < h < n$,
$$\hat{\gamma}(h) = \frac{1}{n} \sum_{t=1}^{n - |h|} (x_{t+|h|} - \bar{x})(x_t - \bar{x})$$
- Using $n$ (rather than $n - |h|$) in the denominator ensures the sample covariance matrix $[\hat{\gamma}(i - j)]_{i,j=1}^{n}$ is nonnegative definite
- Sample autocorrelation function: for $-n < h < n$, $\hat{\rho}(h) = \hat{\gamma}(h) / \hat{\gamma}(0)$
- The sample correlation matrix $[\hat{\rho}(i - j)]_{i,j=1}^{n}$ is also nonnegative definite
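These definitions translate directly into code. A minimal numpy sketch (function names are my own, not from the text), showing the $1/n$ convention in the denominator:

```python
import numpy as np

def sample_acvf(x, h):
    # Sample ACVF at lag h; dividing by n (not n - |h|) keeps the
    # sample covariance matrix nonnegative definite.
    x = np.asarray(x, dtype=float)
    n, h = len(x), abs(h)
    xbar = x.mean()
    return np.sum((x[h:] - xbar) * (x[:n - h] - xbar)) / n

def sample_acf(x, h):
    # Sample ACF: normalize by the lag-0 autocovariance.
    return sample_acvf(x, h) / sample_acvf(x, 0)

rng = np.random.default_rng(0)
x = rng.normal(size=500)
print(sample_acf(x, 0))   # exactly 1 by construction
print(sample_acf(x, 1))   # near 0 for iid noise
```

For iid data the lag-1 sample ACF should fall within roughly $\pm 1.96/\sqrt{n}$ of zero, which is the basis of the tests discussed later.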
Examples of Simple Time Series Models
iid noise and white noise
- White noise: $\{X_t\} \sim \mathrm{WN}(0, \sigma^2)$ means uncorrelated, with zero mean and variance $\sigma^2$
- An $\mathrm{IID}(0, \sigma^2)$ sequence is $\mathrm{WN}(0, \sigma^2)$, but not conversely
Binary process and random walk
Binary process: an example of iid noise, with $P(X_t = 1) = p$ and $P(X_t = -1) = 1 - p$
Random walk: $S_t = X_1 + X_2 + \cdots + X_t$ for $t \ge 1$, with $S_0 = 0$ and $\{X_t\}$ iid noise
$\{S_t\}$ is a simple symmetric random walk if $\{X_t\}$ is a binary process with $p = 1/2$
Random walk is not stationary: if $h \ge 0$, then $\mathrm{Cov}(S_{t+h}, S_t) = t\sigma^2$ depends on $t$
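A quick simulation illustrates the non-stationarity: across many sample paths of a simple symmetric random walk, the sample variance of $S_t$ grows linearly in $t$ (a sketch with illustrative parameters):

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths, n_steps = 20000, 50
# Simple symmetric random walk: iid +/-1 steps, each with probability 1/2
steps = rng.choice([-1, 1], size=(n_paths, n_steps))
S = steps.cumsum(axis=1)

# Var(S_t) = t * Var(X_1) = t here, so the variance depends on t
print(S[:, 9].var())    # close to 10
print(S[:, 49].var())   # close to 50
```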
First-order moving average, MA(1) process
Let $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$ and $\theta \in \mathbb{R}$, then $\{X_t\}$ is an MA(1) process if
$$X_t = Z_t + \theta Z_{t-1}, \quad t = 0, \pm 1, \dots$$
ACVF: $\gamma_X(h) = \sigma^2(1 + \theta^2)$ if $h = 0$; $\sigma^2\theta$ if $|h| = 1$; $0$ if $|h| > 1$ — does not depend on $t$, so $\{X_t\}$ is stationary
ACF: $\rho_X(h) = 1$ if $h = 0$; $\theta / (1 + \theta^2)$ if $|h| = 1$; $0$ if $|h| > 1$
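A numerical check of the MA(1) ACF: simulate a long series and compare the sample ACF at lags 1 and 2 to $\theta/(1+\theta^2)$ and $0$ (the parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n = 0.8, 100_000
Z = rng.normal(size=n + 1)
X = Z[1:] + theta * Z[:-1]          # X_t = Z_t + theta * Z_{t-1}

def acf(x, h):
    # Sample ACF with the 1/n convention (denominators cancel in the ratio)
    xc = x - x.mean()
    return np.dot(xc[h:], xc[:len(x) - h]) / np.dot(xc, xc)

print(acf(X, 1))   # close to theta / (1 + theta^2) = 0.8 / 1.64
print(acf(X, 2))   # close to 0: the ACF cuts off after lag 1
```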
First-order autoregression, AR(1) process
Let $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$, $|\phi| < 1$, and $Z_t$ be uncorrelated with $X_s$ for each $s < t$; then $\{X_t\}$ is an AR(1) process if
$$X_t = \phi X_{t-1} + Z_t, \quad t = 0, \pm 1, \dots$$
ACVF: $\gamma_X(h) = \dfrac{\sigma^2 \phi^{|h|}}{1 - \phi^2}$
ACF: $\rho_X(h) = \phi^{|h|}$
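The geometric decay $\rho_X(h) = \phi^{|h|}$ can be checked the same way; a sketch simulating an AR(1) path started from its stationary distribution (illustrative parameters):

```python
import numpy as np

rng = np.random.default_rng(3)
phi, n = 0.7, 200_000
Z = rng.normal(size=n)
X = np.empty(n)
X[0] = Z[0] / np.sqrt(1 - phi**2)   # start at the stationary distribution
for t in range(1, n):
    X[t] = phi * X[t - 1] + Z[t]    # X_t = phi * X_{t-1} + Z_t

def acf(x, h):
    xc = x - x.mean()
    return np.dot(xc[h:], xc[:len(x) - h]) / np.dot(xc, xc)

for h in (1, 2, 3):
    print(acf(X, h), phi**h)        # sample ACF close to phi^h
```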
Estimate and Eliminate Trend and Seasonal Components
Classical decomposition
Observation $X_t$ can be decomposed into $X_t = m_t + s_t + Y_t$:
- a (slowly changing) trend component $m_t$,
- a seasonal component $s_t$ with period $d$ (i.e., $s_{t+d} = s_t$) and $\sum_{j=1}^{d} s_j = 0$,
- a zero-mean series $Y_t$
Method 1: estimate $m_t$ first, then $s_t$, and hope the noise component $Y_t$ is stationary (to model)
Method 2: differencing
Method 3: trend and seasonality can be estimated together in a regression whose design matrix contains both polynomial and harmonic terms
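Method 3 amounts to one least-squares fit whose design matrix stacks polynomial and harmonic columns; a sketch on synthetic data (all parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
n, d = 120, 12
t = np.arange(n, dtype=float)
# Synthetic series: linear trend + one sine harmonic + noise
x = 2.0 + 0.05 * t + 3.0 * np.sin(2 * np.pi * t / d) + rng.normal(0, 0.5, n)

# Design matrix: polynomial terms (1, t, t^2) and harmonics at frequency 2*pi/d
A = np.column_stack([
    np.ones(n), t, t**2,
    np.cos(2 * np.pi * t / d),
    np.sin(2 * np.pi * t / d),
])
beta, *_ = np.linalg.lstsq(A, x, rcond=None)
print(beta)   # close to [2, 0.05, 0, 0, 3]
```

The fitted coefficients recover the trend slope and the harmonic amplitude in one pass, without first detrending.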
Trend Component Only
Estimate trend: polynomial regression fitting
Observation $X_t$ can be decomposed into a trend component $m_t$ and a zero-mean series $Y_t$: $X_t = m_t + Y_t$
- Least squares polynomial regression: choose $\hat{m}_t = \hat{b}_0 + \hat{b}_1 t + \cdots + \hat{b}_p t^p$ to minimize $\sum_t (x_t - m_t)^2$
Estimate trend: smoothing with a finite MA filter
Linear filter: $\hat{m}_t = \sum_j a_j X_{t+j}$
Two-sided moving average filter, with $q \in \mathbb{N}$:
$$W_t = \frac{1}{2q + 1} \sum_{j=-q}^{q} X_{t-j}$$
$\hat{m}_t = W_t$ for $q + 1 \le t \le n - q$, if $X_t$ only has the trend component but not seasonality ($X_t = m_t + Y_t$), and $m_t$ is approximately linear over $[t - q, t + q]$
$W_t$ is a low-pass filter: it removes the rapidly fluctuating (high-frequency) component $Y_t$, and lets the slowly varying component $m_t$ pass
Estimate trend: exponential smoothing
For any fixed $\alpha \in [0, 1]$, the one-sided MA $\hat{m}_t$ is defined by the recursions
$$\hat{m}_1 = X_1, \qquad \hat{m}_t = \alpha X_t + (1 - \alpha)\hat{m}_{t-1}, \quad t = 2, \dots, n$$
- Equivalently, $\hat{m}_t = \sum_{j=0}^{t-2} \alpha(1 - \alpha)^j X_{t-j} + (1 - \alpha)^{t-1} X_1$: a weighted average with exponentially decreasing weights
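The recursion is a few lines of code; a small deterministic example that can be checked by hand:

```python
import numpy as np

def exp_smooth(x, alpha):
    # One-sided exponential smoothing:
    # m_1 = X_1, and m_t = alpha * X_t + (1 - alpha) * m_{t-1} for t >= 2
    m = np.empty(len(x))
    m[0] = x[0]
    for t in range(1, len(x)):
        m[t] = alpha * x[t] + (1 - alpha) * m[t - 1]
    return m

x = np.array([1.0, 2.0, 3.0, 4.0])
m = exp_smooth(x, alpha=0.5)
print(m)   # [1.0, 1.5, 2.25, 3.125]
```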
Eliminate trend by differencing
Backward shift operator $B$: $BX_t = X_{t-1}$
- Lag-1 difference operator $\nabla$: $\nabla X_t = X_t - X_{t-1} = (1 - B)X_t$
- If $\nabla$ is applied to a linear trend function $m_t = c_0 + c_1 t$, then $\nabla m_t = c_1$
- Powers of operators $B$ and $\nabla$: $B^j X_t = X_{t-j}$ and $\nabla^j X_t = \nabla(\nabla^{j-1} X_t)$, with $\nabla^0 X_t = X_t$
- $\nabla^k$ reduces a polynomial trend of degree $k$ to a constant
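The last point is easy to verify numerically: applying $\nabla^2$ (two rounds of differencing) to a quadratic trend leaves a constant sequence (the trend coefficients below are illustrative):

```python
import numpy as np

t = np.arange(10, dtype=float)
m = 2.0 + 3.0 * t + 0.5 * t**2   # quadratic trend, degree k = 2

d1 = np.diff(m)                  # nabla m_t = m_t - m_{t-1}: still linear in t
d2 = np.diff(m, n=2)             # nabla^2 m_t: constant

print(d2)                        # all entries equal 1.0 (= 2 * 0.5)
```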
Also with the Seasonal Component
Estimate seasonal component: harmonic regression
Observation $X_t$ can be decomposed into a seasonal component $s_t$ and a zero-mean series $Y_t$: $X_t = s_t + Y_t$
- $s_t$: a periodic function of $t$ with period $d$, i.e., $s_{t+d} = s_t$
- Harmonic regression: model $s_t$ as a sum of harmonics (or sine waves)
$$s_t = a_0 + \sum_{j=1}^{k} \left[ a_j \cos(\lambda_j t) + b_j \sin(\lambda_j t) \right]$$
- Unknown (regression) parameters: $a_0, a_1, \dots, a_k, b_1, \dots, b_k$
- Specified parameters:
  - Number of harmonics: $k$
  - Frequencies $\lambda_1, \dots, \lambda_k$, each being some integer multiple of $2\pi/d$
  - Sometimes $\lambda_j$ are instead specified through Fourier indices $f_j = n\lambda_j / (2\pi)$
Estimate trend and seasonal components
Estimate $m_t$: use a MA filter chosen to eliminate the seasonality
- If $d$ is odd, let $d = 2q + 1$ and use $\hat{m}_t = \frac{1}{d}\sum_{j=-q}^{q} x_{t-j}$, for $q < t \le n - q$
- If $d$ is even, let $d = 2q$ and
$$\hat{m}_t = \frac{0.5\,x_{t-q} + x_{t-q+1} + \cdots + x_{t+q-1} + 0.5\,x_{t+q}}{d}, \quad q < t \le n - q$$
Estimate $s_k$: for each $k = 1, \dots, d$
- Compute the average $w_k$ of the deviations $\{x_{k+jd} - \hat{m}_{k+jd} : q < k + jd \le n - q\}$
- To ensure $\sum_{k=1}^{d} \hat{s}_k = 0$, let $\hat{s}_k = w_k - \frac{1}{d}\sum_{i=1}^{d} w_i$, where $\hat{s}_k = \hat{s}_{k-d}$ for $k > d$
Re-estimate $m_t$: based on the deseasonalized data $d_t = x_t - \hat{s}_t$, $t = 1, \dots, n$
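The three steps above can be sketched end-to-end on a synthetic monthly series ($d = 12$, the even case); all data and variable names here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(6)
n, d = 240, 12
q = d // 2
t = np.arange(n, dtype=float)
s_true = np.tile(np.sin(2 * np.pi * np.arange(d) / d), n // d)
x = 1.0 + 0.02 * t + s_true + rng.normal(0, 0.3, n)   # trend + seasonal + noise

# Step 1: trend via the seasonal MA filter (even d: half weights at both ends)
w = np.r_[0.5, np.ones(d - 1), 0.5] / d
m_hat = np.convolve(x, w, mode="valid")   # estimates m_t for the interior t

# Step 2: average the detrended values by season, then center to sum to zero
detr = x[q:n - q] - m_hat
w_k = np.array([detr[(np.arange(len(detr)) + q) % d == k].mean()
                for k in range(d)])
s_hat = w_k - w_k.mean()

# Step 3: re-estimate the trend from the deseasonalized data
deseason = x - np.tile(s_hat, n // d)
coef = np.polyfit(t, deseason, 1)
print(s_hat.round(2))   # close to sin(2*pi*k/12), k = 0..11
print(coef)             # slope close to 0.02, intercept close to 1
```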
Eliminate trend and seasonal components: differencing
Lag-$d$ differencing: $\nabla_d X_t = X_t - X_{t-d} = (1 - B^d)X_t$
- Note: the operators $\nabla_d$ and $\nabla^d = (1 - B)^d$ are different
- Apply $\nabla_d$ to $X_t = m_t + s_t + Y_t$: since $s_t = s_{t-d}$,
$$\nabla_d X_t = m_t - m_{t-d} + Y_t - Y_{t-d}$$
- Then the trend $m_t - m_{t-d}$ can be eliminated using methods discussed before, e.g., applying a power of the operator $\nabla$
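A deterministic sketch: on a series with a linear trend and an exact period-$d$ seasonal component, one lag-$d$ difference removes the seasonality and turns the trend into a constant (the series below is synthetic):

```python
import numpy as np

def lag_diff(x, d):
    # Lag-d difference: nabla_d X_t = X_t - X_{t-d} = (1 - B^d) X_t.
    # Note this differs from nabla^d = (1 - B)^d.
    return x[d:] - x[:-d]

n, d = 48, 12
t = np.arange(n, dtype=float)
s = np.tile(np.arange(d) - (d - 1) / 2, n // d)   # period-d seasonal, mean 0
x = 5.0 + 0.1 * t + s                             # trend + seasonal, no noise

y = lag_diff(x, d)
print(y)   # every entry equals 0.1 * d = 1.2: seasonality gone
```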
Test Whether Estimated Noises are IID
Test series for iid: sample ACF based
| Test name | Test statistic | Distribution under $H_0$ |
|---|---|---|
| Sample ACF | $\hat{\rho}(h)$, for each $h$ | $\approx N(0, 1/n)$ |
| Portmanteau | $Q = n\sum_{j=1}^{h} \hat{\rho}^2(j)$ | $\approx \chi^2(h)$ |

- Under $H_0$, about 95% of the sample ACFs should fall between $\pm 1.96/\sqrt{n}$
- The Portmanteau test has some refinements:
  - Ljung and Box: $Q_{LB} = n(n+2)\sum_{j=1}^{h} \hat{\rho}^2(j)/(n - j) \approx \chi^2(h)$
  - McLeod and Li: $Q_{ML} = n(n+2)\sum_{j=1}^{h} \hat{\rho}_{ww}^2(j)/(n - j) \approx \chi^2(h)$, where $\hat{\rho}_{ww}$ is the sample ACF of the squared data
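The Ljung-Box refinement is simple to compute directly; a hand-rolled sketch (rather than a library call), applied to iid noise and to a strongly dependent series:

```python
import numpy as np

def ljung_box(x, h):
    # Q_LB = n(n+2) * sum_{j=1}^h rho_hat(j)^2 / (n-j),
    # approximately chi^2(h) under H0: the series is iid.
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    denom = xc @ xc
    rho = np.array([xc[j:] @ xc[:n - j] / denom for j in range(1, h + 1)])
    return n * (n + 2) * np.sum(rho**2 / (n - np.arange(1, h + 1)))

rng = np.random.default_rng(7)
q_iid = ljung_box(rng.normal(size=500), h=20)
print(q_iid)   # moderate value, consistent with chi^2(20) under H0

x = np.cumsum(rng.normal(size=500))   # random walk: strongly dependent
q_dep = ljung_box(x, h=20)
print(q_dep)   # very large: reject H0
```

Compare $Q_{LB}$ to the upper $\chi^2(h)$ quantile (about 31.4 for $h = 20$ at the 5% level) to decide.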
Test series for iid: also detect trends
| Test name | Test statistic | Distribution under $H_0$ |
|---|---|---|
| Turning point | $T$: number of turning points | $\approx N\!\left(\frac{2(n-2)}{3}, \frac{16n - 29}{90}\right)$ |
| Difference-sign | $S$: number of $i$ such that $x_i > x_{i-1}$ | $\approx N\!\left(\frac{n-1}{2}, \frac{n+1}{12}\right)$ |

- Time $i$ is a turning point if $x_i - x_{i-1}$ and $x_{i+1} - x_i$ have flipped signs
- A large positive (or negative) value of $S - E(S)$ indicates an increasing (or decreasing) trend
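Both statistics standardize to approximately $N(0,1)$ under $H_0$; a sketch (function names are my own), applied to iid noise and to a trending series:

```python
import numpy as np

def turning_point_stat(x):
    # T = number of times i where consecutive differences flip sign.
    # Under H0 (iid): E[T] = 2(n-2)/3, Var(T) = (16n - 29)/90.
    d = np.diff(x)
    T = int(np.sum(d[:-1] * d[1:] < 0))
    n = len(x)
    mu, var = 2 * (n - 2) / 3, (16 * n - 29) / 90
    return (T - mu) / np.sqrt(var)      # approx N(0, 1) under H0

def difference_sign_stat(x):
    # S = number of i with x_i > x_{i-1}.
    # Under H0: E[S] = (n-1)/2, Var(S) = (n+1)/12.
    S = int(np.sum(np.diff(x) > 0))
    n = len(x)
    mu, var = (n - 1) / 2, (n + 1) / 12
    return (S - mu) / np.sqrt(var)      # large positive => increasing trend

rng = np.random.default_rng(8)
x = rng.normal(size=1000)
z_tp, z_ds = turning_point_stat(x), difference_sign_stat(x)
print(z_tp, z_ds)                       # both near 0 under H0

trend = 0.5 * np.arange(1000) + rng.normal(size=1000)
z_trend = difference_sign_stat(trend)
print(z_trend)                          # large positive: increasing trend
```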
Test series for iid: other methods
- Fitting an AR model
  - Use the Yule-Walker algorithm and choose the order by the AICC statistic
  - If the selected order is zero, then the series is white noise
- Normal qq plot: check normality
A general strategy is to apply all of the above tests, and proceed with caution if any of them suggests the series is not iid
References
- Brockwell, Peter J. and Davis, Richard A. (2016), Introduction to Time Series and Forecasting, Third Edition. New York: Springer