Abstract

We propose a straightforward algorithm to estimate large Bayesian time‐varying parameter vector autoregressions with mixture innovation components for each coefficient in the system. The computational burden becomes manageable by approximating the mixture indicators driving the time‐variation in the coefficients with a latent threshold process that depends on the absolute size of the shocks. Two applications illustrate the merits of our approach. First, we forecast the US term structure of interest rates and demonstrate forecast gains relative to benchmark models. Second, we apply our approach to US macroeconomic data and find significant evidence for time‐varying effects of a monetary policy tightening.
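
To make the approximation concrete, the following is a minimal sketch, not the authors' implementation, of the latent threshold rule described in the abstract: a mixture indicator is switched on only when the absolute size of a coefficient shock exceeds a threshold. The function name, the threshold value `d`, and the toy coefficient path are illustrative assumptions.

```python
import numpy as np

def threshold_indicators(beta_path, d):
    """Approximate the mixture indicators with a latent threshold rule:
    an indicator is switched on whenever the absolute size of the period-t
    coefficient shock exceeds the threshold d (illustrative sketch only)."""
    shocks = np.diff(beta_path)               # period-by-period coefficient shocks
    return (np.abs(shocks) > d).astype(int)   # 1 = time variation, 0 = (almost) constant

# Toy usage: a random-walk coefficient path and an arbitrary threshold.
rng = np.random.default_rng(0)
beta_path = np.cumsum(rng.normal(0.0, 0.1, size=200))
indicators = threshold_indicators(beta_path, d=0.15)
print(f"share of active indicators: {indicators.mean():.2f}")
```

Replacing the draws of the mixture indicators with this deterministic rule is what keeps the computational burden manageable when every coefficient in a large system gets its own indicator.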

Highlights

  • In the last few years, economists in policy institutions and central banks have been criticized for their failure to foresee the recent financial crisis that engulfed the world economy and led to a sharp drop in economic activity.

  • Do all regression parameters vary over time, or is time variation limited to a specific subset of the parameter space? As is the case with virtually any modeling problem, the question naturally arises whether a given variable should be included in the model in the first place.

  • r_{ij,0} and r_{ij,1} denote scalar hyperparameters. This choice implies that we artificially bound θ_{ij,1} away from zero, so that in the upper regime we do not exert strong shrinkage. This is in contrast to a standard time-varying parameter model, where this prior is usually set rather tight to control the degree of time variation in the parameters (see the sketch after this list).
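
The excerpt does not reproduce the functional form of this prior. Purely as an illustrative assumption, one common choice consistent with the bullet above is a Gamma prior on the upper-regime innovation variance, with the two scalar hyperparameters acting as shape and rate:

```latex
% Illustrative assumption only: a Gamma prior with shape r_{ij,0} and rate r_{ij,1};
% the exact prior family is not given in this excerpt.
\theta_{ij,1} \sim \mathcal{G}\bigl(r_{ij,0},\, r_{ij,1}\bigr)
```

Under such a Gamma prior, a shape parameter r_{ij,0} larger than one places essentially no density near zero, which is one way to read the statement that θ_{ij,1} is bounded away from zero and not strongly shrunk in the upper regime.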


Summary

Introduction

In the last few years, economists in policy institutions and central banks have been criticized for their failure to foresee the recent financial crisis that engulfed the world economy and led to a sharp drop in economic activity. The majority of forecasting models adopted were (and possibly still are) linear and low dimensional. The former implies that the underlying structural mechanisms and the volatility of economic shocks are assumed to remain constant over time – a rather restrictive assumption. Relaxing this assumption by allowing the parameters to vary over time, however, sharply increases the number of parameters to be estimated. This renders estimation of large dimensional models like vector autoregressions (VARs) unfeasible. To circumvent such problems, Koop, Leon-Gonzalez, and Strachan (2009) estimate a single Bernoulli random variable to discriminate between time constancy and parameter variation for the autoregressive coefficients, the covariances, and the log-volatilities, respectively.
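
For context, the mixture innovation idea referenced here can be sketched as a state equation in which a Bernoulli indicator decides whether a coefficient moves in a given period. The sketch below is a toy simulation under that assumption; the function name, move probability, and innovation variance are illustrative choices, not taken from Koop, Leon-Gonzalez, and Strachan (2009).

```python
import numpy as np

def simulate_mixture_innovation_coefficient(T=200, p_move=0.05, theta=0.10, seed=1):
    """Mixture innovation state equation for a single coefficient:
    beta_t = beta_{t-1} + k_t * eta_t, where the Bernoulli indicator k_t
    decides whether the coefficient moves (k_t = 1) or stays constant (k_t = 0)."""
    rng = np.random.default_rng(seed)
    beta = np.zeros(T)
    k = rng.binomial(1, p_move, size=T)          # Bernoulli mixture indicators
    for t in range(1, T):
        beta[t] = beta[t - 1] + k[t] * rng.normal(0.0, np.sqrt(theta))
    return beta, k

beta_path, moves = simulate_mixture_innovation_coefficient()
print(f"number of coefficient breaks: {int(moves[1:].sum())}")
```

Estimating such indicators for every coefficient in a large system is what creates the computational burden that the latent threshold approximation above is designed to avoid.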

Econometric framework
Mitigating the computational burden through thresholding
A multivariate extension with stochastic volatility
Prior specification
Posterior simulation
An illustrative example
Forecasting the US term structure of interest rates
Competing models
Structural breaks in US macroeconomic data
Detecting time-variation in reduced form coefficients
Impulse responses to a monetary policy shock
Closing remarks
A Convergence and mixing properties