Abstract

Approximate Bayesian computation has emerged as a standard computational tool when dealing with intractable likelihood functions in Bayesian inference. We show that many common Markov chain Monte Carlo kernels used to facilitate inference in this setting can fail to be variance bounding and hence geometrically ergodic, which can have consequences for the reliability of estimates in practice. This phenomenon is typically independent of the choice of tolerance in the approximation. We prove that a recently introduced Markov kernel can inherit the properties of variance bounding and geometric ergodicity from its intractable Metropolis–Hastings counterpart, under reasonably weak conditions. We show that the computational cost of this alternative kernel is bounded whenever the prior is proper, and present indicative results for an example where spectral gaps and asymptotic variances can be computed, as well as an example involving inference for a partially and discretely observed, time-homogeneous, pure jump Markov process. We also supply two general theorems, one providing a simple sufficient condition for lack of variance bounding for reversible kernels and the other providing a positive result concerning inheritance of variance bounding and geometric ergodicity for mixtures of reversible kernels.

Highlights

  • Approximate Bayesian computation refers to a branch of Monte Carlo methodology that uses the ability to simulate data according to a parametrized likelihood function, in lieu of computation of that likelihood, to perform approximate, parametric Bayesian inference

  • Consistent estimation of var(P, φ) is well established (Hobert et al, 2002; Jones et al, 2006; Bednorz & Łatuszynski, 2007; Flegal & Jones, 2010) for geometrically ergodic chains. We study both the variance bounding (Roberts & Rosenthal, 2008) and geometric ergodicity properties of a number of reversible kernels used for approximate Bayesian computation

  • As a partial remedy to the problems identified by this negative result, we show that under reasonably mild conditions, a kernel proposed in Lee et al (2012) can inherit variance bounding and geometric ergodicity from its intractable Metropolis–Hastings (Metropolis et al, 1953; Hastings, 1970) counterpart

Introduction

Approximate Bayesian computation refers to a branch of Monte Carlo methodology that uses the ability to simulate data according to a parametrized likelihood function, in lieu of computation of that likelihood, to perform approximate, parametric Bayesian inference. Consistent estimation of var(P, φ) is well established (Hobert et al, 2002; Jones et al, 2006; Bednorz & Łatuszynski, 2007; Flegal & Jones, 2010) for geometrically ergodic chains. Motivated by these considerations, we study both the variance bounding (Roberts & Rosenthal, 2008) and geometric ergodicity properties of a number of reversible kernels used for approximate Bayesian computation. As a partial remedy to the problems identified by this negative result, we show that under reasonably mild conditions, a kernel proposed in Lee et al (2012) can inherit variance bounding and geometric ergodicity from its intractable Metropolis–Hastings (Metropolis et al, 1953; Hastings, 1970) counterpart.
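To make the idea concrete, the following is a minimal sketch of an ABC-MCMC kernel in the spirit of the simulation-based Metropolis–Hastings scheme discussed here. The toy model (one observation from Normal(θ, 1), a Normal(0, 10²) prior, a symmetric random-walk proposal) and all tuning values are illustrative assumptions, not the paper's setup: a proposal is accepted only when a single simulated pseudo-dataset lands within tolerance ε of the observation and a prior-ratio Metropolis check passes, so the likelihood is never evaluated.

```python
import math
import random


def abc_mcmc(y_obs, n_iters=5000, eps=0.5, step=1.0, seed=0):
    """Illustrative single-hit ABC-MCMC sketch for a toy model:
    y ~ Normal(theta, 1), prior theta ~ Normal(0, 10^2),
    symmetric random-walk proposal (so the proposal ratio cancels)."""
    rng = random.Random(seed)

    def log_prior(t):
        # Normal(0, 10^2) log-density, up to an additive constant.
        return -t * t / (2.0 * 10.0 ** 2)

    theta = 0.0
    chain = []
    for _ in range(n_iters):
        prop = theta + rng.gauss(0.0, step)
        # Simulate pseudo-data in lieu of evaluating the likelihood.
        y_sim = rng.gauss(prop, 1.0)
        # Accept only if the pseudo-data is within eps of the observation
        # and the prior ratio passes the usual Metropolis check.
        if (abs(y_sim - y_obs) <= eps
                and math.log(rng.random()) < log_prior(prop) - log_prior(theta)):
            theta = prop
        chain.append(theta)
    return chain
```

Because acceptance requires the fresh pseudo-data to "hit" the ε-ball around the observation, the chain can become sticky in regions where such hits are rare, which is the kind of behaviour underlying the variance-bounding failures studied in the paper.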

The Markov kernels
Theoretical properties
A posterior with compact support
Geometric distribution
Stochastic Lotka–Volterra model
Discussion
A Proofs
B Supplementary proofs
C Negative results in other settings