Econometrics 2 - Lecture 3 Univariate Time Series Models Contents nTime Series nStochastic Processes nStationary Processes nThe ARMA Process nDeterministic and Stochastic Trends nModels with Trend nUnit Root Tests nEstimation of ARMA Models March 23, 2018 Hackl, Econometrics 2, Lecture 3 •2 Private Consumption March 23, 2018 Hackl, Econometrics 2, Lecture 3 •3 Private consumption in the EURO area (16 members), quarterly data, seasonally adjusted, AWM database (in Mio EUR) Private Consumption: Growth Rate March 23, 2018 Hackl, Econometrics 2, Lecture 3 •4 Yearly growth of private consumption in the EURO area (16 members), AWM database (in Mio EUR) Mean growth: 15.008 Disposable Income March 23, 2018 Hackl, Econometrics 2, Lecture 3 •5 Disposable income, Austria (in Mio EUR) Time Series March 23, 2018 Hackl, Econometrics 2, Lecture 3 •6 Time-ordered sequence of observations of a random variable Examples: nAnnual values of private consumption nYearly changes in expenditures on private consumption nQuarterly values of personal disposable income nMonthly values of imports Notation: nRandom variable Y nSequence of observations Y1, Y2, ..., YT nDeviations from the mean: yt = Yt – E{Yt} = Yt – μ Components of a Time Series March 23, 2018 Hackl, Econometrics 2, Lecture 3 •7 Components or characteristics of a time series are nTrend nSeasonality nIrregular fluctuations Time series model: represents the characteristics as well as possible interactions Purpose of modelling nDescription of the time series nForecasting the future Example: Quarterly observations of the disposable income Yt = βt + ΣiγiDit + εt with Dit = 1 if t corresponds to the i-th quarter, Dit = 0 otherwise Contents nTime Series nStochastic Processes nStationary Processes nThe ARMA Process nDeterministic and Stochastic Trends nModels with Trend nUnit Root Tests nEstimation of ARMA Models March 23, 2018 Hackl, Econometrics 2, Lecture 3 •8 Hackl, Econometrics 2, Lecture 3 •9 Stochastic Process nTime series: realization of a stochastic process nStochastic process is a sequence of random variables Yt, e.g., n {Yt, t = 1, ..., n} n {Yt, t = -∞, ..., ∞} nJoint distribution of Y1, ..., Yn: n p(y1, …, yn) nOf special interest nEvolution of the expectation mt = E{Yt} over time nDependence structure over time nExample: Extrapolation of a time series as a tool for forecasting March 23, 2018 Hackl, Econometrics 2, Lecture 3 •10 White Noise nWhite noise process {Yt, t = -∞, ..., ∞} nE{Yt} = 0 nV{Yt} = σ² nCov{Yt,Yt-s} = 0 for all (positive or negative) integers s ni.e., a mean zero, serially uncorrelated, homoskedastic process March 23, 2018 Hackl, Econometrics 2, Lecture 3 •11 AR(1)-Process nStates the dependence structure between consecutive observations as n Yt = δ + θYt-1 + εt, |θ| < 1 n with εt: white noise, i.e., V{εt} = σ² (see next slide) nAutoregressive process of order 1 nFrom Yt = δ + θYt-1 + εt = δ + θδ + θ²δ + … + εt + θεt-1 + θ²εt-2 + … follows n E{Yt} = μ = δ(1-θ)^-1 n|θ| < 1 needed for convergence! Invertibility condition nIn deviations from μ, yt = Yt – μ: n yt = θyt-1 + εt March 23, 2018 Hackl, Econometrics 2, Lecture 3 •12 AR(1)-Process, cont’d nAutocovariances γk = Cov{Yt,Yt-k} nk=0: γ0 = V{Yt} = θ²V{Yt-1} + V{εt} = … = Σi θ^2i σ² = σ²(1-θ²)^-1 nk=1: γ1 = Cov{Yt,Yt-1} = E{ytyt-1} = E{(θyt-1+εt)yt-1} = θV{yt-1} n = θσ²(1-θ²)^-1 nIn general: n γk = Cov{Yt,Yt-k} = θ^|k| σ²(1-θ²)^-1, k = 0, ±1, … n depends upon k, not upon t!
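The AR(1) moment results above are easy to check by simulation. The following is a minimal Python sketch (not part of the original slides; it assumes only numpy): it generates a long AR(1) path and compares sample autocovariances with the theoretical values γk = θ^k σ²(1-θ²)^-1 for k ≥ 0.

import numpy as np

rng = np.random.default_rng(0)
T, theta, delta, sigma = 100_000, 0.7, 1.0, 1.0

# simulate Y_t = delta + theta*Y_{t-1} + eps_t with white noise eps_t
eps = rng.normal(0.0, sigma, T)
y = np.empty(T)
y[0] = delta / (1.0 - theta)              # start at the process mean mu
for t in range(1, T):
    y[t] = delta + theta * y[t - 1] + eps[t]

gamma0 = sigma**2 / (1.0 - theta**2)      # theoretical variance
print("mean:", y.mean(), "theoretical:", delta / (1.0 - theta))
for k in range(4):
    emp = np.mean((y[k:] - y.mean()) * (y[:T - k] - y.mean()))
    print(f"gamma_{k}: empirical {emp:.3f}, theoretical {theta**k * gamma0:.3f}")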
March 23, 2018 Hackl, Econometrics 2, Lecture 3 •13 MA(1)-Process nStates the dependence structure between consecutive observations as n Yt = μ + εt + αεt-1 n with εt: white noise, V{εt} = σ² nMoving average process of order 1 n E{Yt} = μ nAutocovariances γk = Cov{Yt,Yt-k} nk=0: γ0 = V{Yt} = σ²(1+α²) nk=1: γ1 = Cov{Yt,Yt-1} = ασ² nγk = 0 for k = 2, 3, … nDepends upon k, not upon t! n n March 23, 2018 Hackl, Econometrics 2, Lecture 3 •14 AR-Representation of MA-Process nThe AR(1) can be represented as MA-process of infinite order n yt = θyt-1 + εt = Σ∞i=0 θi εt-i n given that |θ| < 1 nSimilarly: the AR representation of the MA(1) process n yt = αyt-1 – α²yt-2 ± … + εt = Σ∞i=0 (-1)i αi+1yt-i-1 + εt n given that |α| < 1 March 23, 2018 Contents nTime Series nStochastic Processes nStationary Processes nThe ARMA Process nDeterministic and Stochastic Trends nModels with Trend nUnit Root Tests nEstimation of ARMA Models n n n n March 23, 2018 Hackl, Econometrics 2, Lecture 3 •15 Hackl, Econometrics 2, Lecture 3 •16 Stationary Processes nRefers to the joint distribution of Yt’s, in particular to the second moments n(Weak) stationary or covariance stationary process: the first two moments are finite and not affected by a shift of time n E{Yt} = μ for all t n Cov{Yt, Yt+k} = γk, k = 0, ±1, … for all t and all k n Cov{Yt, Yt+k}, k = 0, ±1,…: covariance function; γt,k = γt,-k nA process is called strictly stationary if its stochastic properties are unaffected by a change of the time origin nThe joint probability distribution at any set of times is not affected by an arbitrary shift along the time axis n March 23, 2018 Hackl, Econometrics 2, Lecture 3 •17 AC and PAC Function nAutocorrelation function (AC function, ACF) nIndependent of the scale of Y nFor a stationary process: n ρk = Corr{Yt,Yt-k} = γk/γ0, k = 0, ±1,… nProperties: q|ρk| ≤ 1 qρk = ρ-k qρ0 = 1 nCorrelogram: graphical presentation of the AC function nPartial autocorrelation function (PAC function, PACF): n θkk = Corr{Yt, Yt-k|Yt-1,...,Yt-k+1}, k = 0, ±1, … nθkk is obtained from Yt = θk0 + θk1Yt-1 + ... + θkkYt-k nPartial correlogram: graphical representation of the PAC function March 23, 2018 Hackl, Econometrics 2, Lecture 3 •18 Examples nfor the AC and PAC functions: nWhite noise n ρ0 = θ00 = 1 n ρk = θkk = 0, if k ≠ 0 n white noise is stationary nAR(1) process, Yt = δ + θYt-1 + εt n ρk = θk, k = 0, ±1,… n θ00 = 1, θ11 = θ, θkk = 0 for k > 1 nMA(1) process, Yt = μ + εt + αεt-1 n ρ0 = 1, ρ1 = α/(1 + α2), ρk = 0 for k > 1 n PAC function: damped exponential if α > 0, alternating and damped exponential if α < 0 n n March 23, 2018 Hackl, Econometrics 2, Lecture 3 •19 Stationarity of MA- and AR- Processes nMA processes are stationary nWeighted sum of white noises nE.g., MA(1) process: Yt = μ + εt + αεt-1 n ρ0 = 1, ρ1 = α/(1 + α2), ρk = 0 for k > 1 nAn AR process is stationary if it is invertible nAR(1) process, Yt = θYt-1 + εt = Σ∞i=0 θi εt-i if |θ| < 1 (invertibility condition) n ρk = θk, k = 0, ±1,… n n March 23, 2018 Hackl, Econometrics 2, Lecture 3 •20 AC and PAC Function: Estimates nEstimator for the AC function ρk: n n n nEstimator for the PAC function θkk: coefficient of Yt-k in the regression of Yt on Yt-1, …, Yt-k March 23, 2018 Hackl, Econometrics 2, Lecture 3 AR(1) Processes, Verbeek, Fig. 8.1 March 23, 2018 •21 Hackl, Econometrics 2, Lecture 3 MA(1) Processes, Verbeek, Fig. 
8.2 March 23, 2018 •22 Contents nTime Series nStochastic Processes nStationary Processes nThe ARMA Process nDeterministic and Stochastic Trends nModels with Trend nUnit Root Tests nEstimation of ARMA Models March 23, 2018 Hackl, Econometrics 2, Lecture 3 •23 Hackl, Econometrics 2, Lecture 3 •24 The ARMA(p,q) Process nGeneralization of the AR and MA processes: ARMA(p,q) process n yt = θ1yt-1 + … + θpyt-p + εt + α1εt-1 + … + αqεt-q n with white noise εt nLag (or shift) operator L (Lyt = yt-1, L0yt = Iyt = yt, Lpyt = yt-p) nARMA(p,q) process in operator notation n θ(L)yt = α(L)εt n with operator polynomials θ(L) and α(L) n θ(L) = I - θ1L - … - θpL^p n α(L) = I + α1L + … + αqL^q March 23, 2018 Hackl, Econometrics 2, Lecture 3 •25 Lag Operator nLag (or shift) operator L nLyt = yt-1, L0yt = Iyt = yt, Lpyt = yt-p nAlgebra of polynomials in L like algebra of variables nExamples: n(I - ϕ1L)(I - ϕ2L) = I – (ϕ1+ϕ2)L + ϕ1ϕ2L² n(I - θL)^-1 = Σ∞i=0 θ^i L^i nMA(∞) representation of the AR(1) process n yt = (I - θL)^-1 εt n the infinite sum is well defined (e.g., has finite variance) only if |θ| < 1 nMA(∞) representation of the ARMA(p,q) process n yt = [θ(L)]^-1 α(L)εt n similarly the AR(∞) representations; invertibility condition: restrictions on parameters March 23, 2018 Hackl, Econometrics 2, Lecture 3 •26 Invertibility of Lag Polynomials nInvertibility condition for lag polynomial θ(L) = I - θL: |θ| < 1 nInvertibility condition for lag polynomial of order 2, θ(L) = I - θ1L - θ2L² nθ(L) = I - θ1L - θ2L² = (I - ϕ1L)(I - ϕ2L) with ϕ1+ϕ2 = θ1 and -ϕ1ϕ2 = θ2 nInvertibility conditions: both (I – ϕ1L) and (I – ϕ2L) invertible; |ϕ1| < 1, |ϕ2| < 1 nInvertibility in terms of the characteristic equation n θ(z) = (1- ϕ1z)(1- ϕ2z) = 0 nCharacteristic roots: solutions z1, z2 from (1- ϕ1z)(1- ϕ2z) = 0 n z1 = ϕ1^-1, z2 = ϕ2^-1 nInvertibility conditions: |z1| = |ϕ1^-1| > 1, |z2| = |ϕ2^-1| > 1 nPolynomial θ(L) is not invertible if any solution zi fulfills |zi| ≤ 1 nCan be generalized to lag polynomials of higher order March 23, 2018 Hackl, Econometrics 2, Lecture 3 •27 Unit Root and Invertibility nLag polynomial of order 1: characteristic equation θ(z) = 1 - θz = 0 nUnit root: characteristic root z = 1; implies θ = 1 nInvertibility condition |θ| < 1 is violated, AR process Yt = θYt-1 + εt is non-stationary nLag polynomial of order 2 nCharacteristic equation θ(z) = (1- ϕ1z)(1- ϕ2z) = 0 nCharacteristic roots zi = 1/ϕi, i = 1, 2 nUnit root: a characteristic root zi of value 1; violates the invertibility condition |z1| = |ϕ1^-1| > 1, |z2| = |ϕ2^-1| > 1 nAR(2) process Yt is non-stationary nAR(p) process: the polynomial θ(z) = 1 - θ1z - … - θpz^p, evaluated at z = 1, is zero if Σiθi = 1; hence Σiθi = 1 indicates a unit root nTests for unit roots are important tools for identifying stationarity March 23, 2018
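In practice, checking invertibility (and thus stationarity of the AR part) amounts to computing the characteristic roots of θ(z) and comparing their moduli with 1. A minimal Python sketch (not part of the original slides; it assumes numpy), here for an AR(2) lag polynomial:

import numpy as np

def characteristic_roots(theta):
    """Roots of theta(z) = 1 - theta_1*z - ... - theta_p*z^p.

    Returns the roots and True if all of them lie outside the unit
    circle (the invertibility/stationarity condition on slide 26)."""
    # numpy.roots expects coefficients from the highest power downwards
    poly = np.r_[-np.asarray(theta)[::-1], 1.0]
    z = np.roots(poly)
    return z, bool(np.all(np.abs(z) > 1.0))

# theta_1 + theta_2 = 1: one characteristic root equals 1 (unit root)
print(characteristic_roots([0.7, 0.3]))
# a stationary AR(2): all roots outside the unit circle
print(characteristic_roots([0.5, 0.3]))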
Contents nTime Series nStochastic Processes nStationary Processes nThe ARMA Process nDeterministic and Stochastic Trends nModels with Trend nUnit Root Tests nEstimation of ARMA Models March 23, 2018 Hackl, Econometrics 2, Lecture 3 •28 Hackl, Econometrics 2, Lecture 3 •29 Types of Trend nTrend: the development of the expected value of a process over time; typically an increasing (or decreasing) pattern nDeterministic trend: a function f(t) of time, describing the evolution of E{Yt} over time n Yt = f(t) + εt, εt: white noise n Example: Yt = α + βt + εt describes a linear trend of Y; an increasing trend corresponds to β > 0 nStochastic trend: Yt = δ + Yt-1 + εt or n ΔYt = Yt – Yt-1 = δ + εt, εt: white noise qdescribes an irregular or random fluctuation of the differences ΔYt around the expected value δ qAR(1) – or AR(p) – process with unit root q“random walk with trend” March 23, 2018 Hackl, Econometrics 2, Lecture 3 •30 Example: Private Consumption nPrivate consumption, AWM database; level values (PCR) and first differences (PCR_D) n[Figure: time series plots of PCR and PCR_D] nMean of PCR_D: 3740 March 23, 2018 Hackl, Econometrics 2, Lecture 3 •31 Trends: Random Walk and AR Process nRandom walk: Yt = Yt-1 + εt; random walk with trend: Yt = 0.1 + Yt-1 + εt; AR(1) process: Yt = 0.2 + 0.7Yt-1 + εt; εt simulated from N(0,1) n[Figure: simulated paths of the random walk, the random walk with trend, and the AR(1) process with δ=0.2, θ=0.7] March 23, 2018 Hackl, Econometrics 2, Lecture 3 •32 Random Walk with Trend nThe random walk with trend Yt = δ + Yt-1 + εt can be written as n Yt = Y0 + δt + Σi≤t εi n δ: trend parameter nComponents of the process nDeterministic growth path Y0 + δt nCumulative errors Σi≤t εi nProperties: nExpectation Y0 + δt depends on Y0, i.e., on the origin (t=0)! nV{Yt} = σ²t becomes arbitrarily large! nCorr{Yt,Yt-k} = √(1-k/t) nRandom walk with trend is non-stationary! March 23, 2018 Hackl, Econometrics 2, Lecture 3 •33 Random Walk with Trend, cont’d nFrom Corr{Yt,Yt-k} = √(1-k/t) follows nFor fixed k, Yt and Yt-k are the more strongly correlated, the larger t nWith increasing k, the correlation tends to zero, but the more slowly, the larger t (long memory property) nRandom walk vs. the AR(1) process Yt = δ + θYt-1 + εt nAR(1) process: εt-i has the smaller weight, the larger i nAR(1) process similar to a random walk when θ is close to one March 23, 2018 Hackl, Econometrics 2, Lecture 3 •34 Non-Stationarity: Consequences nAR(1) process Yt = θYt-1 + εt nOLS estimator for θ: θ̂ = Σt Yt-1Yt / Σt Yt-1² nFor |θ| < 1: the estimator is qconsistent qasymptotically normally distributed nFor θ = 1 (unit root) qθ is underestimated qestimator not normally distributed qspurious regression problem March 23, 2018 Hackl, Econometrics 2, Lecture 3 •35 Integrated Processes nIn order to cope with non-stationarity nTrend-stationary process: the process can be transformed into a stationary process by subtracting the deterministic trend qE.g., Yt = f(t) + εt with white noise εt: Yt – f(t) = εt is stationary nDifference-stationary process, or integrated process: a stationary process can be achieved by differencing qE.g., Yt = δ + Yt-1 + εt: Yt – Yt-1 = δ + εt is stationary nIntegrated process: a stochastic process Y is called nintegrated of order one if the first differences yield a stationary process: Y ~ I(1) nintegrated of order d, if the d-fold differences yield a stationary process: Y ~ I(d) March 23, 2018
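The two ways of removing a trend described above (differencing an integrated series, subtracting a fitted deterministic trend from a trend-stationary series) can be illustrated with a short simulation. A minimal Python sketch (not part of the original slides; it assumes numpy):

import numpy as np

rng = np.random.default_rng(1)
T, delta, beta = 500, 0.1, 0.1
eps = rng.normal(size=T)
t = np.arange(T)

y_rw = np.cumsum(delta + eps)     # difference-stationary: random walk with trend, I(1)
y_ts = beta * t + eps             # trend-stationary: linear trend plus white noise

dy = np.diff(y_rw)                                        # differencing removes the stochastic trend
detrended = y_ts - np.polyval(np.polyfit(t, y_ts, 1), t)  # subtracting the fitted linear trend

print("Delta Y:   mean %.3f, var %.3f (approx. delta and sigma^2)" % (dy.mean(), dy.var()))
print("detrended: mean %.3f, var %.3f" % (detrended.mean(), detrended.var()))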
Hackl, Econometrics 2, Lecture 3 •36 I(0)- vs. I(1)-Processes nI(0) process, e.g., Yt = δ + εt nFluctuates around the process mean with constant variance qMean-reverting qLimited memory nI(1) process, e.g., Yt = δ + Yt-1 + εt = Y0 + δt + Σi≤t εi nFluctuates widely qInfinitely long memory qPersistent effect of shocks March 23, 2018 Hackl, Econometrics 2, Lecture 3 •37 Integrated Stochastic Processes nMany economic time series show stochastic trends nFrom the AWM Database:

Variable  Description                           d
YER       GDP, real                             1
PCR       Consumption, real                     1-2
PYR       Household's Disposable Income, real   1-2
PCD       Consumption Deflator                  2

nARIMA(p,d,q) process: d-th differences follow an ARMA(p,q) process March 23, 2018 Contents nTime Series nStochastic Processes nStationary Processes nThe ARMA Process nDeterministic and Stochastic Trends nModels with Trend nUnit Root Tests nEstimation of ARMA Models March 23, 2018 Hackl, Econometrics 2, Lecture 3 •38 Hackl, Econometrics 2, Lecture 3 •39 Example: Model for a Stochastic Trend nData generation: random walk (without trend): Yt = Yt-1 + εt, εt: white noise nRealization of Yt is a non-stationary process, stochastic trend nV{Yt}: a multiple of t nSpecified model: Yt = α + βt + εt nDeterministic trend nConstant variance nMisspecified model! nConsequences for the OLS estimator for β nt- and F-statistics: wrong critical limits, rejection probability too large nR² indicates explanatory potential although Yt is a random walk without trend n“spurious regression” or “nonsense regression” March 23, 2018 Hackl, Econometrics 2, Lecture 3 •40 White Noise and Random Walk nComputer-generated random numbers neps: white noise, i.e., N(0,1)-distributed nY: random walk n Yt = Yt-1 + epst March 23, 2018 Hackl, Econometrics 2, Lecture 3 •41 Random Walk and Deterministic Trend nFitting the deterministic trend model Yt = α + βt + εt to the random walk data results in Ŷt = -0.92 + 0.096 t with t-statistic 19.77 for b, R² = 0.66, and Durbin-Watson statistic 0.066 March 23, 2018
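The spurious regression effect of slides 39-41 is easy to reproduce. A minimal Python sketch (not part of the original slides; it assumes numpy and statsmodels): regressing a driftless random walk on a deterministic time trend typically produces a "significant" slope, a sizeable R², and a Durbin-Watson statistic close to zero.

import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(2)
T = 200
y = np.cumsum(rng.normal(size=T))       # random walk without trend
X = sm.add_constant(np.arange(T))       # intercept and deterministic trend t

res = sm.OLS(y, X).fit()                # misspecified deterministic-trend model
print("slope estimate:", res.params[1])
print("t-statistic:   ", res.tvalues[1])
print("R-squared:     ", res.rsquared)
print("Durbin-Watson: ", durbin_watson(res.resid))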
Hackl, Econometrics 2, Lecture 3 •42 How to Model Trends? nSpecification of a nDeterministic trend, e.g., Yt = α + βt + εt: risk of spurious regression, wrong decisions nStochastic trend: analysis of differences ΔYt if a random walk, i.e., a unit root, is suspected nConsequences of spurious regression are more serious nConsequences of modeling differences ΔYt: nAutocorrelated errors nConsistent estimators nAsymptotically normally distributed estimators nHAC correction of standard errors, i.e., heteroskedasticity and autocorrelation consistent estimates of standard errors March 23, 2018 Hackl, Econometrics 2, Lecture 3 •43 Elimination of Trend nRandom walk Yt = δ + Yt-1 + εt with white noise εt n ΔYt = Yt – Yt-1 = δ + εt nΔYt is a stationary process nA random walk is a difference-stationary or I(1) process nLinear trend Yt = α + βt + εt nSubtracting the trend component α + βt provides a stationary process nYt is a trend-stationary process March 23, 2018 Contents nTime Series nStochastic Processes nStationary Processes nThe ARMA Process nDeterministic and Stochastic Trends nModels with Trend nUnit Root Tests nEstimation of ARMA Models March 23, 2018 Hackl, Econometrics 2, Lecture 3 •44 Hackl, Econometrics 2, Lecture 3 •45 Unit Root Tests nAR(1) process Yt = δ + θYt-1 + εt with white noise εt nDickey-Fuller or DF test (Dickey & Fuller, 1979) n Test of H0: θ = 1 against H1: θ < 1, i.e., H0 states Y ~ I(1), Y is non-stationary nKPSS test (Kwiatkowski, Phillips, Schmidt & Shin, 1992) n Test of H0: θ < 1 against H1: θ = 1, i.e., H0 states Y ~ I(0), Y is stationary nAugmented Dickey-Fuller or ADF test n extension of the DF test nVarious modifications like Phillips-Perron test, Dickey-Fuller GLS test, etc. March 23, 2018 Hackl, Econometrics 2, Lecture 3 •46 Dickey-Fuller's Unit Root Test nAR(1) process Yt = δ + θYt-1 + εt with white noise εt nOLS estimator θ̂ for θ (from the regression of Yt on an intercept and Yt-1) nTest statistic: DF = (θ̂ – 1)/se(θ̂) nDistribution of DF nIf |θ| < 1: approximately t(T-1) nIf θ = 1: Dickey & Fuller critical values nDF test for testing H0: θ = 1 against H1: θ < 1 nθ = 1: characteristic equation 1 – θz = 0 has a unit root March 23, 2018 Hackl, Econometrics 2, Lecture 3 •47 Dickey-Fuller Critical Values nMonte Carlo estimates of critical values for n DF0: Dickey-Fuller test without intercept; Yt = θYt-1 + εt n DF: Dickey-Fuller test with intercept; Yt = δ + θYt-1 + εt n DFτ: Dickey-Fuller test with time trend; Yt = δ + γt + θYt-1 + εt March 23, 2018

T           p = 0.01   p = 0.05   p = 0.10
25   DF0      -2.66      -1.95      -1.60
     DF       -3.75      -3.00      -2.63
     DFτ      -4.38      -3.60      -3.24
100  DF0      -2.60      -1.95      -1.61
     DF       -3.51      -2.89      -2.58
     DFτ      -4.04      -3.45      -3.15
     N(0,1)   -2.33      -1.65      -1.28

Hackl, Econometrics 2, Lecture 3 •48 Unit Root Test: The Practice nAR(1) process Yt = δ + θYt-1 + εt with white noise εt n can be written with π = θ – 1 as n ΔYt = δ + πYt-1 + εt nDF tests H0: π = 0 against H1: π < 0 n the test statistic for π = θ – 1 = 0, i.e., π̂/se(π̂), is identical with the DF statistic nTwo steps: 1. Regression of ΔYt on Yt-1: OLS estimator for π = θ – 1 2. Test of H0: π = 0 against H1: π < 0 based on DF; critical values of Dickey & Fuller March 23, 2018 Hackl, Econometrics 2, Lecture 3 •49 Example: Price/Earnings Ratio nVerbeek’s data set PE: annual time series data on composite stock price and earnings indices of the S&P500, 1871-2002 nPE: price/earnings ratio qMean 14.6 qMin 6.1 qMax 36.7 qSt.Dev. 5.1 nlog(PE) qMean 2.63 qMin 1.81 qMax 3.60 qSt.Dev. 0.33 March 23, 2018
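The two-step procedure of slide 48 takes only a few lines of code. A minimal Python sketch (not part of the original slides; it assumes numpy and statsmodels): the t-ratio of Yt-1 in the regression of ΔYt on an intercept and Yt-1 is the DF statistic and has to be compared with the Dickey-Fuller critical values of slide 47 (e.g., -2.89 at the 5% level for T = 100), not with the t-distribution.

import numpy as np
import statsmodels.api as sm

def df_statistic(y):
    """DF test with intercept: regress Delta Y_t on a constant and Y_{t-1},
    return the estimate of pi = theta - 1 and its t-ratio (the DF statistic)."""
    dy = np.diff(y)
    X = sm.add_constant(y[:-1])
    res = sm.OLS(dy, X).fit()
    return res.params[1], res.tvalues[1]

rng = np.random.default_rng(3)
rw = np.cumsum(rng.normal(size=100))                    # random walk: unit root
ar = sm.tsa.arma_generate_sample([1, -0.7], [1], 100)   # stationary AR(1), theta = 0.7
print("random walk:   pi-hat %.3f, DF %.2f" % df_statistic(rw))
print("stationary AR: pi-hat %.3f, DF %.2f" % df_statistic(ar))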
Hackl, Econometrics 2, Lecture 3 •50 Price/Earnings Ratio, cont’d nFitting an AR(1) process to the log(PE) data gives: n ΔYt = 0.335 – 0.125Yt-1 n with t-statistic -2.569 (for Yt-1) and p-value 0.1021 np-value of the DF statistic (-2.569): 0.102 q1% critical value: -3.48 q5% critical value: -2.88 q10% critical value: -2.58 nH0: θ = 1 (non-stationarity) cannot be rejected for log(PE) nUnit root test for first differences: ΔΔYt = 0.008 – 0.935ΔYt-1, DF statistic -10.59, p-value 0.000 (1% critical value: -3.48) nlog(PE) is I(1) nHowever: for the sample 1871-1990: DF statistic -3.65, p-value 0.006; within the period 1871-1990, log(PE) is stationary March 23, 2018 Hackl, Econometrics 2, Lecture 3 •51 Unit Root Test: Extensions nDF test so far for a model with intercept: ΔYt = δ + πYt-1 + εt nTests for alternative or extended models nDF test for a model without intercept: ΔYt = πYt-1 + εt nDF test for a model with intercept and trend: ΔYt = δ + γt + πYt-1 + εt nDF tests in all cases H0: π = 0 against H1: π < 0 nTest statistic in all cases: DF = π̂/se(π̂) nCritical values depend on the case; cf. the table on slide 47 March 23, 2018 Hackl, Econometrics 2, Lecture 3 •52 KPSS Test nProcess Yt = βt + (rt + α) + εt, with deterministic time trend βt, a random walk rt = rt-1 + ut with white noise ut with variance σu², r0 = α serving as intercept, and white noise error term εt nTest of H0: σu² = 0, i.e., Yt is trend-stationary, or Yt – βt is stationary, against H1: σu² > 0 nH0 implies a unit moving average root in the ARMA representation of ΔYt nKPSS (Kwiatkowski, Phillips, Schmidt, Shin) test statistic n KPSS = Σt St² / (T²s²) n with St = Σi≤t ei and the variance estimate s² of the residuals et from the regression Yt = δ + βt + εt nReject H0 for large values of KPSS nCritical values from Monte Carlo simulations March 23, 2018 Hackl, Econometrics 2, Lecture 3 •53 ADF Test nExtended model according to an AR(p) process: n ΔYt = δ + πYt-1 + β1ΔYt-1 + … + βp-1ΔYt-p+1 + εt nExample: the AR(2) process Yt = δ + θ1Yt-1 + θ2Yt-2 + εt can be written as n ΔYt = δ + (θ1+θ2-1)Yt-1 – θ2ΔYt-1 + εt n the characteristic equation (1- ϕ1z)(1- ϕ2z) = 0 has roots zi = 1/ϕi, with θ1 = ϕ1 + ϕ2 and θ2 = -ϕ1ϕ2 n a unit root (ϕ1 = 1) implies θ1 + θ2 = 1 nAugmented DF (ADF) test nTest of H0: π = 0, i.e., Y ~ I(1), against H1: π < 0 nFor the choice of p: information criterion, e.g., AIC nExtensions (intercept, trend) similar to the DF test nPhillips-Perron test: alternative method; uses HAC-corrected standard errors March 23, 2018 Hackl, Econometrics 2, Lecture 3 •54 ADF-GLS Test nVariant of the Dickey-Fuller test nThe variable to be tested is assumed to have na non-zero mean or na linear trend nDe-meaning or de-trending nGLS procedure suggested by Elliott, Rothenberg and Stock (1996) nADF-GLS test has higher power than the ADF test March 23, 2018 Hackl, Econometrics 2, Lecture 3 •55 Price/Earnings Ratio, cont’d nExtended model according to an AR(2) process gives: n ΔYt = 0.366 – 0.136Yt-1 + 0.152ΔYt-1 - 0.093ΔYt-2 n with t-statistics -2.487 (Yt-1), 1.667 (ΔYt-1) and -1.007 (ΔYt-2) and n p-values 0.014, 0.098 and 0.316 np-value of the DF statistic (-2.487): 0.119 q1% critical value: -3.48 q5% critical value: -2.88 q10% critical value: -2.58 nNon-stationarity cannot be rejected for log(PE) nUnit root test for first differences: DF statistic -7.31, p-value 0.000 (1% critical value: -3.48) nlog(PE) is I(1) nHowever: for the sample 1871-1990: DF statistic -3.52, p-value 0.009 March 23, 2018
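Outside GRETL, ADF- and KPSS-type tests are available, e.g., in statsmodels. A minimal Python sketch (not part of the original slides; it assumes numpy and statsmodels): adfuller selects the number of lagged differences by AIC, as suggested on slide 53, and kpss tests the reverse null hypothesis of (trend-)stationarity.

import numpy as np
from statsmodels.tsa.stattools import adfuller, kpss

rng = np.random.default_rng(4)
y = np.cumsum(0.05 + rng.normal(size=200))   # random walk with trend: I(1)

# ADF test: H0 is a unit root; lag order chosen by AIC
adf_stat, adf_p, usedlag, nobs, crit, _ = adfuller(y, regression="c", autolag="AIC")
print("ADF:", round(adf_stat, 2), "p-value:", round(adf_p, 3), "lags:", usedlag, crit)

# KPSS test: H0 is trend stationarity (the reverse null)
kpss_stat, kpss_p, kpss_lags, kpss_crit = kpss(y, regression="ct")
print("KPSS:", round(kpss_stat, 2), "p-value:", round(kpss_p, 3))

# the first differences should behave like an I(0) series
print("ADF on the differences:", adfuller(np.diff(y), regression="c", autolag="AIC")[:2])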
Hackl, Econometrics 2, Lecture 3 •56 Unit Root Tests in GRETL nFor a marked variable: nVariable > Unit root tests > Augmented Dickey-Fuller test n Performs the qDF test (choose zero for “Lag order for ADF test”) or the qADF test qwith or without constant, trend, squared trend nVariable > Unit root tests > ADF-GLS test n Performs the qDF test (choose zero for “Lag order for ADF test”) or the qADF test qDe-meaning or de-trending using GLS nVariable > Unit root tests > KPSS test n Performs the KPSS test with or without a trend March 23, 2018 Contents nTime Series nStochastic Processes nStationary Processes nThe ARMA Process nDeterministic and Stochastic Trends nModels with Trend nUnit Root Tests nEstimation of ARMA Models March 23, 2018 Hackl, Econometrics 2, Lecture 3 •57 Hackl, Econometrics 2, Lecture 3 •58 ARMA Models: Application nApplication of the ARMA(p,q) model in data analysis: three steps 1. Model specification, i.e., choice of p, q (and d if an ARIMA model is specified) 2. Parameter estimation 3. Diagnostic checking March 23, 2018 Hackl, Econometrics 2, Lecture 3 •59 Estimation of ARMA Models nThe estimation methods nOLS estimation nML estimation nAR models: Yt = δ + θ1Yt-1 + θ2Yt-2 + … + θpYt-p + εt nExplanatory variables are lagged values of the explained variable nUncorrelated with the error term nOLS estimation gives consistent estimators March 23, 2018 Hackl, Econometrics 2, Lecture 3 •60 MA Models: OLS Estimation nMA models nMinimization of the sum of squared deviations is not straightforward nE.g., for an MA(1) model, S(μ,α) = Σt[Yt - μ - α Σj=0..∞ (-α)^j (Yt-j-1 – μ)]² qS(μ,α) is a nonlinear function of the parameters qNeeds Yt-j-1 for j = 0, 1, …, i.e., historical Ys, s < 0 nApproximate solution from minimization of n S*(μ,α) = Σt[Yt - μ - α Σj=0..t-2 (-α)^j (Yt-j-1 – μ)]² nNonlinear minimization, grid search (over -1 < α < 1) nEstimators for μ and α: consistent and asymptotically normal nARMA models combine the AR part with the MA part March 23, 2018 Hackl, Econometrics 2, Lecture 3 •61 ML Estimation nAssumption of normally distributed εt nLog-likelihood function, conditional on initial values n log L(α,θ,μ,σ²) = - [(T-1)/2] log(2πσ²) – (2σ²)^-1 Σt εt² n εt are functions of the parameters and of the past of y nAR(1): εt = yt - θ1yt-1 nMA(1): εt = Σj=0..t-1 (-α)^j yt-j nInitial values: y1 for AR, ε0 = 0 for MA nConditional (on initial values) ML estimators: identical to OLS estimators nExtension to the exact ML estimator nAgain, estimation for AR models is easier nARMA models combine the AR part with the MA part nApproximation of MA and ARMA models by an AR model of high order March 23, 2018 Hackl, Econometrics 2, Lecture 3 •62 Model Specification nBased on estimates of nAutocorrelation function (ACF) nPartial autocorrelation function (PACF) nStructure of the AC and PAC functions is typical for AR and MA processes nExample: nMA(1) process: ρ0 = 1, ρ1 = α/(1+α²); ρi = 0, i = 2, 3, …; PACF θkk: damped exponential (cf. slide 18) nAR(1) process: ρk = θ^k, k = 0, 1, …; θ00 = 1, θ11 = θ, θkk = 0 for k > 1 nEmpirical ACF and PACF give indications on the process underlying the time series March 23, 2018 Hackl, Econometrics 2, Lecture 3 •63 ARMA(p,q)-Processes

Condition      AR(p): θ(L)Yt = εt            MA(q): Yt = α(L)εt            ARMA(p,q): θ(L)Yt = α(L)εt
Stationarity   roots zi of θ(z)=0: |zi| > 1  always stationary             roots zi of θ(z)=0: |zi| > 1
Invertibility  always invertible             roots zi of α(z)=0: |zi| > 1  roots zi of α(z)=0: |zi| > 1
AC function    damped, infinite              ρk = 0 for k > q              damped, infinite
PAC function   θkk = 0 for k > p             damped, infinite              damped, infinite

March 23, 2018
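As a counterpart to the exact-ML (Kalman filter) estimation that GRETL reports on the following slides, here is a minimal Python sketch (not part of the original slides; it assumes numpy and statsmodels): it simulates an ARMA(1,1) process and estimates it by exact Gaussian ML.

import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.arima_process import arma_generate_sample

rng = np.random.default_rng(5)
# simulate y_t = 0.6*y_{t-1} + e_t + 0.3*e_{t-1}
y = arma_generate_sample(ar=[1, -0.6], ma=[1, 0.3], nsample=500,
                         distrvs=rng.standard_normal)

res = ARIMA(y, order=(1, 0, 1)).fit()   # exact Gaussian ML via the state space form
print(res.summary().tables[1])          # constant, AR and MA estimates
print("log-likelihood:", res.llf, "AIC:", res.aic, "BIC:", res.bic)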
Hackl, Econometrics 2, Lecture 3 •64 Estimated AC and PAC Functions nEstimation of the AC and PAC functions nAC ρk: estimated by the sample autocorrelation rk = Σt=k+1..T (Yt – Ȳ)(Yt-k – Ȳ) / Σt=1..T (Yt – Ȳ)² nPAC θkk: coefficient of Yt-k in the regression of Yt on Yt-1, …, Yt-k nMA(q) process: standard errors for rk, k > q, from n √T(rk – ρk) → N(0, vk) n with vk = 1 + 2ρ1² + … + 2ρq² ntest of H0: ρ1 = 0, i.e., the model is MA(0): compare √T r1 with the critical value from N(0,1), etc. nAR(p) process: test of H0: ρk = 0 for k > p based on the asymptotic distribution March 23, 2018 Hackl, Econometrics 2, Lecture 3 •65 Diagnostic Checking nARMA(p,q): adequacy of the choices p and q nAnalysis of residuals from the fitted model: nCorrect specification: residuals are realizations of white noise nBox-Ljung Portmanteau test: for an ARMA(p,q) process n QK = T(T+2) Σk=1..K rk²/(T-k) n follows the Chi-squared distribution with K-p-q df nOverfitting nStarting point: a model chosen based on the sample AC and PAC functions nComparison with a model with further parameters: test the significance of the additional parameters March 23, 2018 Hackl, Econometrics 2, Lecture 3 •66 Example: Price/Earnings Ratio nData set PE: PE = price/earnings nlog(PE) qMean 2.63 qMin 1.81 qMax 3.60 qStd 0.33 nlog(PE) ~ I(1) March 23, 2018 Hackl, Econometrics 2, Lecture 3 •67 PE Ratio: AC and PAC Function n[Figure: sample ACF and PACF of Δlog(PEt), the relative change of PEt] March 23, 2018 At level 0.05, significant values are §ACF: k = 4 §PACF: k = 2, 4 possibly MA(4) (ACFk = 0 for k > 4) or AR(4) PE Ratio: MA(4) Model nMA(4) model for the differences Δyt = Δlog(PEt) = log(PEt) – log(PEt-1), LOGPE = log(PE) March 23, 2018 Hackl, Econometrics 2, Lecture 3 •68

Function evaluations: 37
Evaluations of gradient: 11

Model 2: ARMA, using observations 1872-2002 (T = 131)
Estimated using Kalman filter (exact ML)
Dependent variable: d_LOGPE
Standard errors based on Hessian

             coefficient   std. error   t-ratio   p-value
  -------------------------------------------------------
  const       0,00804276   0,0104120     0,7725   0,4398
  theta_1     0,0478900    0,0864653     0,5539   0,5797
  theta_2    -0,187566     0,0913502    -2,053    0,0400  **
  theta_3    -0,0400834    0,0819391    -0,4892   0,6247
  theta_4    -0,146218     0,0915800    -1,597    0,1104

Mean dependent var    0,008716   S.D. dependent var    0,181506
Mean of innovations  -0,000308   S.D. of innovations   0,174545
Log-likelihood        42,69439   Akaike criterion     -73,38877
Schwarz criterion    -56,13759   Hannan-Quinn         -66,37884

PE Ratio: AR(4) Model nAR(4) model for the differences Δyt = Δlog(PEt) = log(PEt) – log(PEt-1) March 23, 2018 Hackl, Econometrics 2, Lecture 3 •69

Function evaluations: 36
Evaluations of gradient: 9

Model 3: ARMA, using observations 1872-2002 (T = 131)
Estimated using Kalman filter (exact ML)
Dependent variable: d_LOGPE
Standard errors based on Hessian

             coefficient   std. error   t-ratio   p-value
  -------------------------------------------------------
  const       0,00842210   0,0111324     0,7565   0,4493
  phi_1       0,0601061    0,0851737     0,7057   0,4804
  phi_2      -0,202907     0,0856482    -2,369    0,0178  **
  phi_3      -0,0228251    0,0853236    -0,2675   0,7891
  phi_4      -0,206655     0,0850843    -2,429    0,0151  **

Mean dependent var    0,008716   S.D. dependent var    0,181506
Mean of innovations  -0,000315   S.D. of innovations   0,173633
Log-likelihood        43,35448   Akaike criterion     -74,70896
Schwarz criterion    -57,45778   Hannan-Quinn         -67,69903
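The comparison of competing models on the next slide (AIC, BIC and the Box-Ljung Q12 statistic) can be mimicked as follows. A minimal Python sketch (not part of the original slides; it assumes numpy and a recent statsmodels, whose acorr_ljungbox returns a DataFrame and accepts model_df for the K-p-q degrees of freedom); the series is simulated as a stand-in for d_LOGPE, so the numbers will differ from those on the slide.

import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.arima_process import arma_generate_sample
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(6)
dy = arma_generate_sample(ar=[1], ma=[1, 0.0, -0.25], nsample=300,
                          distrvs=rng.standard_normal)   # stand-in for d_LOGPE

candidates = {"MA(2)": (0, 0, 2), "AR(2)": (2, 0, 0), "ARMA(2,2)": (2, 0, 2)}
for name, order in candidates.items():
    res = ARIMA(dy, order=order).fit()
    lb = acorr_ljungbox(res.resid, lags=[12], model_df=order[0] + order[2])
    q12_p = float(lb["lb_pvalue"].iloc[0])
    print(f"{name}: AIC {res.aic:7.1f}  BIC {res.bic:7.1f}  Q12 p-value {q12_p:.3f}")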
Hackl, Econometrics 2, Lecture 3 •70 PE Ratio: Various Models nDiagnostics for various competing models: Δyt = log(PEt) – log(PEt-1) nBest fit for nBIC: MA(2) model Δyt = 0.008 + et – 0.250 et-2 nAIC: AR(2,4) model Δyt = 0.008 – 0.202 Δyt-2 – 0.211 Δyt-4 + et March 23, 2018

Model   Lags   AIC       BIC       Q12     p-value
MA(4)   1-4    -73.389   -56.138    5.03   0.957
AR(4)   1-4    -74.709   -57.458    3.74   0.988
MA      2, 4   -76.940   -65.440    5.48   0.940
AR      2, 4   -78.057   -66.556    4.05   0.982
MA      2      -76.072   -67.447    9.30   0.677
AR      2      -73.994   -65.368   12.12   0.436

Hackl, Econometrics 2, Lecture 3 •71 Time Series Models in GRETL nVariable > Unit root tests > (a) Augmented Dickey-Fuller test, (b) ADF-GLS test, (c) KPSS test a) DF test or ADF test with or without constant, trend and squared trend b) DF test or ADF test with or without trend, GLS estimation for de-meaning and de-trending c) KPSS (Kwiatkowski, Phillips, Schmidt, Shin) test nModel > Time Series > ARIMA nEstimates an ARMA model, with or without exogenous regressors March 23, 2018 Hackl, Econometrics 2, Lecture 3 •72 Your Homework 1. Use Greene’s data set GREENE18_1 (Corporate bond yields, 1990:01 to 1994:12) and answer the following questions for the variable YIELD (yield on Moody’s Aaa rated corporate bond). a) Using the model statement “Ordinary Least Squares …” in GRETL, (i) regress ΔYIELD on YIELD(-1) and an intercept and compute the DF test statistic for a unit root; (ii) what do you conclude about the presence of a unit root and about the stationarity of YIELD? b) Produce a time series plot of YIELD. Interpret the graph in view of the results of a). c) Using GRETL, conduct ADF tests (i) without and (ii) with a linear trend, and (iii) with seasonal dummies. What do you conclude about the presence of a unit root? Compare the results with those of a). d) Transform YIELD into its first differences d_YIELD. Repeat c) for the differences. What do you conclude? March 23, 2018 Hackl, Econometrics 2, Lecture 3 •73 Your Homework e) Determine the sample ACF and PACF for YIELD. What orders of an ARMA model for YIELD are suggested by these graphs? f) Estimate (i) an AR(1) and (ii) an AR(2) model for YIELD; (iii) test for autocorrelation in the residuals of the two models. What do you conclude? 2. For the random walk with trend Yt = δ + Yt-1 + εt, show that (a) V{Yt} = σ²t, and (b) Corr{Yt,Yt-k} = √(1-k/t). 3. For the AR(1) process Yt = θYt-1 + εt with white noise εt, show that (a) the ACF is ρk = θ^|k|, k = 0, ±1, …, and that (b) the PACF is θ00 = 1, θ11 = θ, θkk = 0 for k > 1. March 23, 2018