Vector Autoregression

Vector Autoregression (VAR) is a way to model multiple time series that affect each other.

Standard autoregression: predict a variable from its own past values. Vector autoregression: predict multiple variables from all their past values, including each other’s.

Say you want to model GDP, interest rates, and unemployment. In a VAR model, each variable at time t depends on lagged values of all three:

$$Y_t = c + \Phi_1 Y_{t-1} + \Phi_2 Y_{t-2} + \dots + \epsilon_t$$

Where $Y_t$ is a vector of all your variables and the $\Phi$ matrices capture how past values influence current ones.
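A minimal sketch of what estimation looks like for a VAR(1): simulate data from known coefficients, then recover them by ordinary least squares on the stacked lagged regressors. All numbers here are made up for illustration.

```python
import numpy as np

# Hypothetical 3-variable VAR(1): simulate it, then recover the
# coefficient matrix Phi and intercept c by OLS.
rng = np.random.default_rng(0)
k, T = 3, 500
Phi = np.array([[0.5, 0.1, 0.0],
                [0.0, 0.4, 0.2],
                [0.1, 0.0, 0.3]])   # true lag-1 coefficients (assumed)
c = np.array([0.2, 0.1, 0.0])       # true intercept vector (assumed)

Y = np.zeros((T, k))
for t in range(1, T):
    Y[t] = c + Phi @ Y[t - 1] + 0.1 * rng.standard_normal(k)

# Regressors for each t: [1, Y_{t-1}] -> predict Y_t
X = np.hstack([np.ones((T - 1, 1)), Y[:-1]])
B, *_ = np.linalg.lstsq(X, Y[1:], rcond=None)
c_hat, Phi_hat = B[0], B[1:].T      # row 0: intercepts; rest: lag coefs

print(np.round(Phi_hat, 2))         # should be close to Phi
```

In practice you would fit this with a library (e.g. statsmodels' `VAR` class in Python) rather than hand-rolled least squares, but the mechanics are the same: each equation is a linear regression on the lags of every variable.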

What you can do with it:

Forecasting. Once you’ve estimated the coefficients, you can project forward. Each variable’s forecast incorporates information from all the others.
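Forecasting with a fitted VAR(1) is just iterating the recursion with future shocks set to their expected value of zero. A sketch, with the intercept and coefficient matrix assumed rather than estimated:

```python
import numpy as np

# Assumed fitted VAR(1) parameters (illustrative values only).
c = np.array([0.2, 0.1, 0.0])
Phi = np.array([[0.5, 0.1, 0.0],
                [0.0, 0.4, 0.2],
                [0.1, 0.0, 0.3]])
y_last = np.array([1.0, 0.5, -0.2])  # most recent observation

forecasts = []
y = y_last
for h in range(50):
    y = c + Phi @ y                  # expected Y at horizon h+1
    forecasts.append(y)

# For a stationary VAR, long-horizon forecasts converge to the
# unconditional mean, mu = (I - Phi)^{-1} c.
mu = np.linalg.solve(np.eye(3) - Phi, c)
print(np.round(forecasts[0], 3), np.round(mu, 3))
```

Note how every element of the forecast vector uses all three lagged variables at once: that cross-variable information is the whole point of going from AR to VAR.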

Impulse response. What happens to the system if you shock one variable? Bump up interest rates unexpectedly and trace through how GDP and unemployment respond over time.
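For a VAR(1), tracing a shock through the system is mechanical: the response h steps after a one-unit shock to variable i is $\Phi^h$ applied to the i-th unit vector. A sketch with plain (non-orthogonalized) responses and an assumed coefficient matrix:

```python
import numpy as np

# Assumed VAR(1) coefficients (illustrative values only).
Phi = np.array([[0.5, 0.1, 0.0],
                [0.0, 0.4, 0.2],
                [0.1, 0.0, 0.3]])

shock = np.array([1.0, 0.0, 0.0])    # one-unit surprise to variable 0
response = shock
for h in range(12):
    print(h, np.round(response, 3))  # how all three variables react
    response = Phi @ response        # propagate one step forward
```

In a stationary system the responses die out; how fast, and through which variables, is what the impulse-response plot shows. (Real applications usually orthogonalize the shocks first, since the errors are correlated across equations.)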

Variance decomposition. How much of the movement in unemployment is explained by its own history vs. shocks to GDP or interest rates?
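A sketch of forecast-error variance decomposition for a VAR(1), using a Cholesky factorization to orthogonalize the shocks. The coefficient matrix and error covariance are assumed values for illustration:

```python
import numpy as np

# Assumed VAR(1) coefficients and error covariance (illustrative).
Phi = np.array([[0.5, 0.1, 0.0],
                [0.0, 0.4, 0.2],
                [0.1, 0.0, 0.3]])
Sigma = np.array([[1.0, 0.3, 0.1],
                  [0.3, 1.0, 0.2],
                  [0.1, 0.2, 1.0]])
P = np.linalg.cholesky(Sigma)        # orthogonalized shock loadings
k, H = 3, 10

contrib = np.zeros((k, k))           # contrib[j, i]: shock i -> variable j
Psi = np.eye(k)                      # step-s response matrix, Phi**s
for s in range(H):
    contrib += (Psi @ P) ** 2        # accumulate squared responses
    Psi = Phi @ Psi

# Each row: share of variable j's H-step forecast-error variance
# attributable to each orthogonalized shock. Rows sum to 1.
fevd = contrib / contrib.sum(axis=1, keepdims=True)
print(np.round(fevd, 2))
```

One caveat worth knowing: the Cholesky ordering matters, so the decomposition depends on which variable you place first.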

The catch: VAR models are atheoretical. They capture correlations and dynamics, but they don't tell you about causation. And the parameter count grows quickly: with k variables and p lags, each equation has 1 + kp coefficients, so the full system has k(1 + kp), which makes it easy to overfit.
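To make the overfitting risk concrete, here is the parameter count for a VAR with k variables, p lags, and an intercept per equation (the function name is made up for illustration):

```python
# Each of the k equations has 1 intercept plus k coefficients
# for each of the p lags, so the system has k * (1 + k * p)
# free parameters in total.
def var_param_count(k: int, p: int) -> int:
    return k * (1 + k * p)

print(var_param_count(3, 4))    # 3 variables, 4 lags  -> 39
print(var_param_count(10, 4))   # 10 variables, 4 lags -> 410
```

Ten macro series at four lags already means 410 coefficients, which is why quarterly samples of a few hundred observations run out of degrees of freedom fast.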

Useful when you have interrelated time series and want to understand or forecast their joint behavior. Common in macroeconomics and finance where everything affects everything else.