Abstract

Infinitesimal perturbation analysis is a widely used approach to assess the input sensitivities of stochastic dynamic systems in the classical simulation context. In this paper, we introduce an efficient numerical approach to infinitesimal perturbation analysis in the context of the dependent Markov Chain Monte Carlo simulations widely used in Bayesian inference. Building on recent developments, we develop a scheme based on automatic differentiation that computes posterior sensitivities from exact (up to computer floating-point error) first-order derivatives of draws (Jacobians) with respect to prior input parameters alongside the estimation algorithm. Assessing such local robustness of posterior inference poses a challenge for existing methods such as finite differencing, symbolic differentiation, and likelihood ratio methods due to the complex stochastic dependence structure and computational intensity of dependent sample draws. The proposed methods allow for a comprehensive and exact local sensitivity analysis of Markov Chain Monte Carlo output with respect to all input parameters, i.e., prior hyper-parameters (prior robustness) and chain starting values (convergence). Our implementation focuses on methods for Gibbs-based algorithms widely used for inference and forecasting in complex high-dimensional settings. We illustrate how the methods can help practitioners assess convergence and prior robustness in an application of a Bayesian Vector Autoregression with shrinkage priors to US macroeconomic time series data and forecasting.

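To make the idea concrete, the following is a minimal sketch (not the paper's implementation) of differentiating a Gibbs sampler with automatic differentiation in JAX: a conjugate Normal model is sampled with fixed PRNG keys, so every draw is a deterministic function of the prior hyper-parameters, and the derivative of the posterior-mean estimate with respect to the prior mean m0 is obtained alongside the simulation. The model, hyper-parameter names, and data are illustrative assumptions.

```python
# Illustrative sketch: exact derivative of Gibbs-sampler output with respect
# to a prior hyper-parameter via automatic differentiation (JAX).
# Model, data, and hyper-parameter choices are assumptions for illustration.
import jax
import jax.numpy as jnp

key = jax.random.PRNGKey(0)
y = jax.random.normal(key, (50,)) + 2.0          # synthetic data
n, ybar = y.shape[0], y.mean()

def gibbs_mean_estimate(m0, v0=10.0, a0=2.0, b0=2.0, n_iter=500, burn=100):
    """Run a fixed-seed Gibbs sampler and return the posterior mean of mu.

    Because the draws are deterministic functions of fixed PRNG keys and of
    the hyper-parameters, the whole estimator is differentiable in m0.
    """
    keys = jax.random.split(jax.random.PRNGKey(1), n_iter)

    def step(state, k):
        mu, sigma2 = state
        k_mu, k_sig = jax.random.split(k)
        # mu | sigma2, y  ~  Normal (conjugate update)
        prec = 1.0 / v0 + n / sigma2
        mean = (m0 / v0 + n * ybar / sigma2) / prec
        mu = mean + jnp.sqrt(1.0 / prec) * jax.random.normal(k_mu)
        # sigma2 | mu, y  ~  Inverse-Gamma; JAX's gamma sampler is
        # differentiable in its shape parameter (implicit reparameterisation)
        a = a0 + 0.5 * n
        b = b0 + 0.5 * jnp.sum((y - mu) ** 2)
        sigma2 = b / jax.random.gamma(k_sig, a)
        return (mu, sigma2), mu

    # Start the chain at data-based values and collect the mu draws
    _, mu_draws = jax.lax.scan(step, (ybar, jnp.var(y)), keys)
    return mu_draws[burn:].mean()

# Posterior mean and its exact (to floating point) sensitivity to the prior mean m0
estimate = gibbs_mean_estimate(0.0)
sensitivity = jax.grad(gibbs_mean_estimate)(0.0)
print(estimate, sensitivity)
```

The same pattern extends, in principle, to full Jacobians of the draws (e.g., via jax.jacfwd over all hyper-parameters and starting values), which is the kind of output the paper uses for prior-robustness and convergence diagnostics.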