Week 1
Autocorrelation
The correlation between $y_t$ and lagged values of the variable itself, $y_{t-k}$.
First-order autocorrelation
Because the sample means $\bar{y}$ of $y_t$ and $y_{t-1}$ and their sample variances are almost the same, we can compute the first-order autocorrelation as
$$\hat{\rho}_1 = \frac{\sum_{t=2}^{T}(y_t - \bar{y})(y_{t-1} - \bar{y})}{\sum_{t=1}^{T}(y_t - \bar{y})^2}$$
$k$-th order autocorrelation
The $k$-th order autocorrelation is $\hat{\rho}_k = \hat{\gamma}_k/\hat{\gamma}_0$, where $\hat{\gamma}_k$ is an estimate of the $k$-th order autocovariance, that is, $\hat{\gamma}_k = \frac{1}{T}\sum_{t=k+1}^{T}(y_t - \bar{y})(y_{t-k} - \bar{y})$. The set of all autocorrelations $\hat{\rho}_k$ for $k = 1, 2, \ldots$ is called the empirical autocorrelation function (EACF).
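A minimal Python sketch of these estimators (the function name empirical_acf and the use of NumPy are my own choices, not from the notes):

    import numpy as np

    def empirical_acf(y, max_lag):
        # rho_hat_k = gamma_hat_k / gamma_hat_0, with
        # gamma_hat_k = (1/T) * sum_{t=k+1}^{T} (y_t - ybar)(y_{t-k} - ybar)
        y = np.asarray(y, dtype=float)
        T = len(y)
        d = y - y.mean()
        gamma0 = np.sum(d * d) / T
        # returns rho_hat_1, ..., rho_hat_max_lag
        return np.array([np.sum(d[k:] * d[:T - k]) / T / gamma0
                         for k in range(1, max_lag + 1)])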
White noise
A time series $\varepsilon_t$ is white noise if it has the following three properties:
- $E(\varepsilon_t) = 0$ for $t = 1, 2, \ldots, T$
- $E(\varepsilon_t^2) = \sigma^2$ for $t = 1, 2, \ldots, T$
- $E(\varepsilon_s \varepsilon_t) = 0$ for $s, t = 1, 2, \ldots, T$ and $s \neq t$
Information set
The information set at time $t-1$, $\mathcal{Y}_{t-1} = \{y_1, y_2, \ldots, y_{t-1}\}$, is the available history of the time series up to time $t-1$.
Conditional distribution
The conditional distribution is $f(y_t \mid \mathcal{Y}_{t-1})$. If we know $g(\cdot)$ and also the values of the parameters $\theta$ in the general time series model $y_t = g(y_{t-1}, y_{t-2}, \ldots, y_{t-p}; \theta) + \varepsilon_t$, then the conditional distribution $f(y_t \mid \mathcal{Y}_{t-1})$ of $y_t$ is the same as the distribution of $\varepsilon_t$.
First order autoregressive model
The first order autoregressive ($AR(1)$) model is given by $y_t = \phi_1 y_{t-1} + \varepsilon_t$, $t = 1, 2, \ldots, T$. So, $y_t = \phi_1^t y_0 + \phi_1^{t-1}\varepsilon_1 + \cdots + \phi_1\varepsilon_{t-1} + \varepsilon_t = \phi_1^t y_0 + \sum_{i=0}^{t-1}\phi_1^i\varepsilon_{t-i}$, where $y_0$ is a pre-sample starting value.
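As an illustration, a small simulation sketch of this recursion (the values of phi1, T and y0 are arbitrary choices), which also checks the back-substituted form numerically:

    import numpy as np

    rng = np.random.default_rng(0)
    phi1, T, y0 = 0.7, 200, 0.0
    eps = rng.normal(0.0, 1.0, T)        # white noise shocks eps_1, ..., eps_T

    y = np.empty(T)
    prev = y0
    for t in range(T):
        y[t] = phi1 * prev + eps[t]      # y_t = phi1 * y_{t-1} + eps_t
        prev = y[t]

    # y_T = phi1^T * y0 + sum_{i=0}^{T-1} phi1^i * eps_{T-i}
    closed = phi1 ** T * y0 + sum(phi1 ** i * eps[T - 1 - i] for i in range(T))
    print(np.isclose(y[-1], closed))     # True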
Effect types
When $|\phi_1| < 1$, $\phi_1^i \to 0$ as $i$ increases, so a shock $\varepsilon_{t-i}$ has a transitory effect on the time series $y_t$. When $|\phi_1|$ exceeds 1, the effect of the shocks $\varepsilon_{t-i}$ on $y_t$ increases with $i$, and the time series is called explosive. When $\phi_1 = 1$, we have $y_t = y_0 + \sum_{i=0}^{t-1}\varepsilon_{t-i}$, where $\varepsilon_{t-i}$ has the same impact on all observations $y_{t-i+h}$, $h = 0, 1, \ldots$; shocks are then said to have permanent effects.
(Un)conditional mean
Consider the $AR(1)$ model with $|\phi_1| < 1$. Given that $\varepsilon_t$ is a white noise series with $E(\varepsilon_t \mid \mathcal{Y}_{t-1}) = E(\varepsilon_t) = 0$, the conditional mean of $y_t$ is equal to $E(y_t \mid \mathcal{Y}_{t-1}) = \phi_1 y_{t-1}$. The unconditional mean of the time series is $E(y_t) = \mu = \phi_1^t y_0$, and as $t \to \infty$ we find $E(y_t) = 0$.
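Spelling out the step (treating $y_0$ as a fixed pre-sample value and using $E(\varepsilon_t) = 0$):
$$E(y_t) = E(\phi_1 y_{t-1} + \varepsilon_t) = \phi_1 E(y_{t-1}) = \phi_1^2 E(y_{t-2}) = \cdots = \phi_1^t y_0 \to 0 \quad \text{as } t \to \infty \text{ when } |\phi_1| < 1.$$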
Intercepts
We can change the (unconditional) mean to a nonzero value by including an intercept $\delta$ in the model, so $y_t = \delta + \phi_1 y_{t-1} + \varepsilon_t = \delta\sum_{i=0}^{t-1}\phi_1^i + \phi_1^t y_0 + \sum_{i=0}^{t-1}\phi_1^i\varepsilon_{t-i}$. As $t \to \infty$, $E(y_t) = \frac{\delta}{1-\phi_1} = \mu$. We have that $\delta = \mu(1-\phi_1)$, so $y_t - \mu = \phi_1(y_{t-1} - \mu) + \varepsilon_t$.
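One way to see the last step: substituting $\delta = \mu(1-\phi_1)$ into the model and subtracting $\mu$ gives
$$y_t - \mu = \mu(1-\phi_1) + \phi_1 y_{t-1} + \varepsilon_t - \mu = \phi_1 y_{t-1} - \phi_1\mu + \varepsilon_t = \phi_1(y_{t-1} - \mu) + \varepsilon_t.$$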
Stationarity
The $AR(1)$ model can be written as $y_t = \varepsilon_t + \pi_1\varepsilon_{t-1} + \pi_2\varepsilon_{t-2} + \cdots$, where $\pi_i = \phi_1^i$ and $\varepsilon_t$ is a white noise time series. If $|\phi_1| < 1$, then $\pi_i \to 0$ as $i$ increases, and the $AR(1)$ model is called stationary. This means that the unconditional mean, unconditional variance and autocorrelations of $y_t$ are constant over time. In the $AR(1)$ model, $|\phi_1| < 1$ is a necessary and sufficient condition for stationarity.
Correlation between $y_t$ and shocks
It holds that
- $E(y_t\varepsilon_t) = \sigma^2$ (a short derivation follows below)
- $E(y_t\varepsilon_{t+j}) = 0$ for $j = 1, 2, \ldots$
- $E(y_t\varepsilon_{t-k}) \neq 0$ for $k = 1, 2, \ldots$
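For instance, the first property follows directly from the $AR(1)$ equation, since $y_{t-1}$ depends only on shocks dated $t-1$ and earlier:
$$E(y_t\varepsilon_t) = E[(\phi_1 y_{t-1} + \varepsilon_t)\varepsilon_t] = \phi_1 E(y_{t-1}\varepsilon_t) + E(\varepsilon_t^2) = 0 + \sigma^2 = \sigma^2.$$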
Variance
The variance of $y_t$ is $\gamma_0 = E[(y_t - E[y_t])(y_t - E[y_t])]$. If we assume that $E(y_t) = 0$ or $\delta = 0$, then $\gamma_0 = \frac{\sigma^2}{1-\phi_1^2}$ when $|\phi_1| < 1$. Since $V(y_t \mid \mathcal{Y}_{t-1}) = \sigma^2$, we have that the larger $|\phi_1|$ is, the larger $\gamma_0$ becomes relative to $V(y_t \mid \mathcal{Y}_{t-1})$.
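A short derivation, using that $y_{t-1}$ and $\varepsilon_t$ are uncorrelated and that stationarity implies $V(y_t) = V(y_{t-1}) = \gamma_0$:
$$\gamma_0 = V(\phi_1 y_{t-1} + \varepsilon_t) = \phi_1^2 V(y_{t-1}) + \sigma^2 = \phi_1^2\gamma_0 + \sigma^2 \;\Rightarrow\; \gamma_0 = \frac{\sigma^2}{1-\phi_1^2}.$$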
Autocovariance
The first order autocovariance for an $AR(1)$ time series is $\gamma_1 = \phi_1\gamma_0$. The $k$-th order autocovariance of the time series $y_t$ is $\gamma_k = E[(y_t - E[y_t])(y_{t-k} - E[y_{t-k}])] = E[\phi_1(y_{t-1} - \mu)(y_{t-k} - \mu)] = \phi_1\gamma_{k-1}$ (the cross term $E[\varepsilon_t(y_{t-k} - \mu)]$ equals zero for $k \geq 1$), with $\gamma_k$ defined for $k = \ldots, -2, -1, 0, 1, 2, \ldots$
Autocorrelation function (ACF)
The $k$-th order autocorrelation of $y_t$ is $\rho_k = \frac{\gamma_k}{\gamma_0} = \frac{\phi_1\gamma_{k-1}}{\gamma_0} = \phi_1\rho_{k-1}$, so $\rho_k = \phi_1^k$. The autocorrelations of an $AR(1)$ model with $|\phi_1| < 1$ thus decline exponentially towards zero. It holds that $\rho_0 = 1$ and $\rho_{-k} = \rho_k$ for $k = 1, 2, \ldots$
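As a quick numerical illustration (the values of phi1 and T are arbitrary), one can simulate a stationary $AR(1)$ and compare the empirical autocorrelations with the theoretical $\rho_k = \phi_1^k$:

    import numpy as np

    rng = np.random.default_rng(1)
    phi1, T = 0.8, 100_000
    eps = rng.normal(0.0, 1.0, T)

    y = np.empty(T)
    y[0] = eps[0]
    for t in range(1, T):
        y[t] = phi1 * y[t - 1] + eps[t]

    d = y - y.mean()
    gamma0 = np.sum(d * d) / T
    for k in range(1, 6):
        rho_hat = np.sum(d[k:] * d[:-k]) / T / gamma0
        print(k, round(rho_hat, 3), round(phi1 ** k, 3))   # empirical vs theoretical rho_k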
Autocorrelation with unit roots
The unit root case with $\phi_1 = 1$ gives $y_t = y_{t-1} + \varepsilon_t$. Taking $y_0 = 0$, we then have $E(y_t) = 0$, $\gamma_{0,t} = E(y_t^2) = t\sigma^2$ and $\gamma_{k,t} = E(y_t y_{t-k}) = (t-k)\sigma^2$, so $\rho_{k,t} = \frac{t-k}{t}$, $k > 0$. When $t$ becomes large, all (theoretical) autocorrelations $\rho_{k,t}$ approach 1.
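These moments follow from writing the random walk as a sum of shocks, $y_t = y_0 + \sum_{i=1}^{t}\varepsilon_i$ with $y_0 = 0$:
$$\gamma_{0,t} = E(y_t^2) = \sum_{i=1}^{t}E(\varepsilon_i^2) = t\sigma^2, \qquad \gamma_{k,t} = E(y_t y_{t-k}) = E\!\left[\left(y_{t-k} + \sum_{i=t-k+1}^{t}\varepsilon_i\right)y_{t-k}\right] = E(y_{t-k}^2) = (t-k)\sigma^2.$$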
Effect of the sign of $\phi_1$
When $0 < \phi_1 < 1$, all correlations are positive and decline monotonically. When $-1 < \phi_1 < 0$, all even correlations are positive and all odd correlations are negative, and the correlations decline monotonically towards zero in absolute value.
Lag operator
The so-called lag operator $L$ is defined by $L^k y_t = y_{t-k}$ for $k = \ldots, -2, -1, 0, 1, 2, \ldots$. $L$ can be used in products and ratios, and in adding and subtracting operations.
𝐴𝑅(𝑝) model
If we include $p$ lagged variables, we get $y_t = \phi_1 y_{t-1} + \cdots + \phi_p y_{t-p} + \varepsilon_t$. This is an autoregressive model of order $p$. It can be written as $\phi_p(L)y_t = \varepsilon_t$, where $\phi_p(L)$ is the AR polynomial in $L$ of order $p$: $\phi_p(L) = 1 - \phi_1 L - \cdots - \phi_p L^p$. The characteristic polynomial is $\phi_p(z)$, that is, $\phi_p(L)$ with $z$ substituted for $L$. The roots of this polynomial determine whether the effects of shocks are transitory or permanent.
Roots
The characteristic polynomial of the $AR(1)$ model is given by $\phi_1(z) = 1 - \phi_1 z$, and its root is $z = \phi_1^{-1}$. When $\phi_1 = 1$, this root equals 1, and in that case the $AR(1)$ polynomial is said to have a unit root (and shocks have permanent effects). When $|\phi_1| < 1$, the root exceeds 1 in absolute value (and shocks have transitory effects). Since higher-order $AR(p)$ models can have complex roots, the root of $\phi_1(z) = 1 - \phi_1 z$ is said to lie "outside the unit circle" when $|\phi_1| < 1$.
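A small Python sketch (the AR(2) coefficients are arbitrary illustrations) that checks stationarity by computing the roots of the characteristic polynomial $\phi_p(z) = 1 - \phi_1 z - \cdots - \phi_p z^p$ with NumPy:

    import numpy as np

    # Illustrative AR(2): y_t = 0.5*y_{t-1} + 0.3*y_{t-2} + eps_t
    phi = [0.5, 0.3]

    # Characteristic polynomial 1 - phi_1*z - ... - phi_p*z^p;
    # np.roots expects coefficients from the highest power down to the constant.
    coeffs = [-c for c in reversed(phi)] + [1.0]
    roots = np.roots(coeffs)

    print(roots)                              # roots of phi_p(z)
    print(np.all(np.abs(roots) > 1.0))        # True: all roots outside the unit circle, so stationary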
Moving average (MA) model
The MA model of order $q$, $MA(q)$, is $y_t = \varepsilon_t + \theta_1\varepsilon_{t-1} + \cdots + \theta_q\varepsilon_{t-q}$. We may rewrite any stationary $AR(p)$ model in MA form, $y_t = \phi_p(L)^{-1}\varepsilon_t$. In an $MA(2)$ model the variance equals $\gamma_0 = (1 + \theta_1^2 + \theta_2^2)\sigma^2$. For an $MA(q)$ model it holds that $\gamma_k = \left(\sum_{i=0}^{q-k}\theta_i\theta_{i+k}\right)\sigma^2$ for $k = 0, 1, \ldots, q$ and $\gamma_k = 0$ for $k > q$, with $\theta_0 = 1$.
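A minimal sketch of this autocovariance formula (the function name ma_autocovariances and the parameter values are my own choices):

    import numpy as np

    def ma_autocovariances(theta, sigma2, max_lag):
        # gamma_k = (sum_{i=0}^{q-k} theta_i * theta_{i+k}) * sigma2 for k <= q,
        # gamma_k = 0 for k > q, with theta_0 = 1
        th = np.r_[1.0, theta]                # prepend theta_0 = 1
        q = len(th) - 1
        return [sigma2 * np.sum(th[: q - k + 1] * th[k:]) if k <= q else 0.0
                for k in range(max_lag + 1)]

    # Example: MA(2) with theta_1 = 0.4, theta_2 = 0.2 and sigma^2 = 1
    print(ma_autocovariances([0.4, 0.2], 1.0, 4))
    # gamma_0 = 1 + 0.4^2 + 0.2^2 = 1.20, and gamma_3, gamma_4 are 0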