Abstract

Many modern statistical applications involve inference for complicated stochastic models for which the likelihood function is difficult or even impossible to calculate, and hence conventional likelihood-based inferential techniques cannot be used. In such settings, Bayesian inference can be performed using Approximate Bayesian Computation (ABC). However, despite many recent developments to ABC methodology, in many applications the computational cost of ABC necessitates the choice of summary statistics and tolerances that can potentially severely bias the estimate of the posterior. We propose a new "piecewise" ABC approach suitable for discretely observed Markov models that involves writing the posterior density of the parameters as a product of factors, each a function of only a subset of the data, and then using ABC within each factor. The approach has the advantage of side-stepping the need to choose a summary statistic, and it enables a stringent tolerance to be set, making the posterior "less approximate". We investigate two methods for estimating the posterior density based on ABC samples for each of the factors: the first is to use a Gaussian approximation for each factor, and the second is to use a kernel density estimate. Both methods have their merits. The Gaussian approximation is simple, fast, and probably adequate for many applications. On the other hand, using instead a kernel density estimate has the benefit of consistently estimating the true piecewise ABC posterior as the number of ABC samples tends to infinity. We illustrate the piecewise ABC approach with four examples; in each case, the approach offers fast and accurate inference.
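The Gaussian-approximation variant described above can be sketched in a few lines: run rejection ABC separately on each single transition of a discretely observed Markov chain, fit a Gaussian to each factor's accepted draws, and combine the factors by multiplying the Gaussians (precisions add). The AR(1)-style model, the flat prior, and all variable names below are illustrative assumptions, not the paper's own examples.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy discretely observed Markov model (an assumption for illustration):
# X_t = theta * X_{t-1} + N(0, sigma^2), with true theta = 0.5.
theta_true, sigma = 0.5, 0.1
x = [1.0]
for _ in range(10):
    x.append(theta_true * x[-1] + rng.normal(0.0, sigma))

def abc_factor(x_prev, x_next, m=100_000, eps=0.01):
    """Rejection ABC for one transition: draw theta from the (flat) prior,
    simulate a single step from x_prev, and keep theta when the simulated
    value lies within eps of the observed x_next."""
    theta = rng.uniform(-1.0, 1.0, m)
    sim = theta * x_prev + rng.normal(0.0, sigma, m)
    return theta[np.abs(sim - x_next) < eps]

# Gaussian approximation of each factor, combined by multiplying the
# Gaussians; with a flat prior the prior-correction term is constant,
# so the factor precisions simply add.
means, precs = [], []
for i in range(1, len(x)):
    samp = abc_factor(x[i - 1], x[i])
    if samp.size < 50:  # skip factors with too few accepted draws
        continue
    means.append(samp.mean())
    precs.append(1.0 / samp.var())

post_prec = sum(precs)
post_mean = sum(p * a for p, a in zip(precs, means)) / post_prec
print(post_mean, post_prec ** -0.5)  # approximate posterior mean and s.d.
```

The stringent tolerance (eps much smaller than the noise scale) is feasible here precisely because each ABC run only has to match a single observation, which is the motivation for the piecewise factorisation.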

Highlights

  • Stochastic models are commonly used to model processes in the physical sciences (Wilkinson 2011a; Van Kampen 2007)

  • These methods were originally developed for applications in population genetics (Pritchard et al. 1999) and human demographics (Beaumont et al. 2002), but are being used in a wide range of fields including epidemiology (McKinley et al. 2009), evolution of species (Toni et al. 2009), finance (Dean et al. 2011), and evolution of pathogens (Gabriel et al. 2010), to name a few

  • Two differences between Approximate Bayesian Computation (ABC) and piecewise ABC (PW-ABC) are clear: first, in ABC the conditioning is on the simulated trajectory, whereas in PW-ABC the conditioning is on the data; and second, in PW-ABC the convolution is with respect to a different kernel (29)


Summary

Introduction

Stochastic models are commonly used to model processes in the physical sciences (Wilkinson 2011a; Van Kampen 2007). Algorithm 1 generates exact samples from the Bayesian posterior density π(θ | x), which is proportional to π(x | θ)π(θ). The algorithm is of practical use only if X(t) is discrete; otherwise the acceptance probability in Step 3 is zero. Quantities based on (i) are denoted by superscript g, and those based on (ii) by superscript k. In both cases we discuss the behaviour of the estimators in the asymptotic regime in which the number of observations, n, is kept fixed while the size of each ABC sample increases, m → ∞. These results generalise routinely to the case of a product of n kernel density estimates, that is, in which φik(θ) is used as an estimator for φi(θ), since the θi∗(j) are independent for all i, j. It then follows, using (14)–, that expressions for the covariances Bj2,…,jn, means aj2,…,jn, and weights wj2,…,jn, analogous to those in (8)–(10), are given in Appendix 1.
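The exact rejection sampler referred to as Algorithm 1 can be sketched as follows: draw θ from the prior, simulate data from the model, and accept θ only on an exact match with the observed data. This is only workable for discrete X(t), so the sketch uses a toy binomial observation (in the spirit of the binomial example listed below); the model, counts, and names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy discrete model for illustration: a single Binomial(N, theta)
# observation, so exact matching has non-zero acceptance probability.
N, x_obs = 20, 13      # sample size and observed count (assumed values)
m = 200_000            # number of proposals

theta = rng.uniform(0.0, 1.0, m)    # Step 1: draw theta from the prior
x_sim = rng.binomial(N, theta)      # Step 2: simulate data given theta
accepted = theta[x_sim == x_obs]    # Step 3: accept iff exact match

# Accepted draws are exact posterior samples; with a Uniform(0, 1) prior
# the posterior here is Beta(x_obs + 1, N - x_obs + 1).
print(accepted.mean(), accepted.size)
```

For continuous X(t) the event x_sim == x_obs has probability zero, which is exactly why ABC replaces the match in Step 3 with a tolerance condition.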

Estimating the posterior density
An expression for the posterior density
Practical issues in drawing samples
Practical numerical calculations for the kernel approximation
Estimating the marginal likelihood
Examples
Binomial model
Cox–Ingersoll–Ross Model
An integer-valued autoregressive model
Stochastic Lotka–Volterra model
Conclusion and discussion
