Abstract

We present an offline, iterated particle filter to facilitate statistical inference in general state space hidden Markov models. Given a model and a sequence of observations, the associated marginal likelihood L is central to likelihood-based inference for unknown statistical parameters. We define a class of “twisted” models: each member is specified by a sequence ψ of positive functions and has an associated ψ-auxiliary particle filter that provides unbiased estimates of L. We identify a sequence ψ∗ that is optimal in the sense that the ψ∗-auxiliary particle filter’s estimate of L has zero variance. In practical applications, ψ∗ is unknown, so the ψ∗-auxiliary particle filter cannot straightforwardly be implemented. We use an iterative scheme to approximate ψ∗, and demonstrate empirically that the resulting iterated auxiliary particle filter significantly outperforms the bootstrap particle filter in challenging settings. Applications include parameter estimation using a particle Markov chain Monte Carlo algorithm.
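
To make the zero-variance property concrete, the sketch below computes such an optimal twisting sequence by backward recursion in the special case of a finite-state HMM, where the recursion is tractable; the names mu, P and G are illustrative assumptions rather than notation from the paper. In general state spaces the analogous recursion involves intractable integrals, which is what motivates the iterative approximation scheme.

    import numpy as np

    def optimal_twisting_finite_hmm(mu, P, G):
        """Backward recursion for optimal twisting functions in a finite-state HMM.

        Illustrative sketch (not the paper's algorithm): mu is the initial
        distribution of shape (K,), P the K x K transition matrix, and
        G[t, k] = g(k, y_t) the observation density at time t in state k.
        Returns psi with psi[t, k] = p(y_{t:T} | X_t = k) and the exact
        marginal likelihood L = sum_k mu[k] * psi[0, k]; a particle filter
        twisted by these functions would estimate L with zero variance.
        """
        T, K = G.shape
        psi = np.empty((T, K))
        psi[-1] = G[-1]                       # psi_T = g(., y_T)
        for t in range(T - 2, -1, -1):
            psi[t] = G[t] * (P @ psi[t + 1])  # psi_t = g(., y_t) * E[psi_{t+1} | .]
        return psi, float(mu @ psi[0])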

Highlights

  • Particle filtering, or sequential Monte Carlo (SMC), methodology involves simulating an artificial particle system over time

  • To compare the efficiency of the iterated auxiliary particle filter (iAPF) and the bootstrap particle filter (BPF) within a particle marginal Metropolis–Hastings (PMMH) algorithm, we analyzed a sequence of T = 945 observations y1:T: mean-corrected daily returns computed from the weekday close pound/dollar exchange rates r1:T+1 from 1/10/81 to 28/6/85 (a sketch of this transformation follows these highlights)

  • We have presented the iAPF, an offline algorithm that approximates an idealized particle filter whose marginal likelihood estimates have zero variance

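As a rough illustration of the data transformation mentioned in the second highlight, the snippet below computes mean-corrected daily log returns from a series of close rates. The scaling by 100 is a common convention in the stochastic volatility literature and is an assumption here, not a detail taken from the paper.

    import numpy as np

    def mean_corrected_returns(rates):
        """Mean-corrected daily log returns from close rates r_1, ..., r_{T+1}.

        Illustrative only: the 100x scaling follows a common convention in the
        stochastic volatility literature and is assumed, not taken from the paper.
        """
        rates = np.asarray(rates, dtype=float)
        log_returns = np.diff(np.log(rates))               # log(r_{t+1} / r_t), t = 1, ..., T
        return 100.0 * (log_returns - log_returns.mean())

    # Example with made-up rates; the analysis described above would instead use
    # the 946 weekday pound/dollar close rates from 1/10/81 to 28/6/85.
    y = mean_corrected_returns([1.80, 1.79, 1.81, 1.78, 1.77])
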

Summary

Introduction

Sequential Monte Carlo (SMC) methodology involves the simulation over time of an artificial particle system (ξ_t^i; t ∈ {1, . . . , T}, i ∈ {1, . . . , N}). When (4) holds, this corresponds exactly to running the bootstrap particle filter (BPF) of Gordon, Salmond, and Smith (1993), and the quantity Z defined in (1) is identical to L, so that Z^N defined in (2) is an approximation of L. This work builds upon a number of methodological advances, most notably the twisted particle filter (Whiteley and Lee 2014), the APF (Pitt and Shephard 1999), block sampling (Doucet, Briers, and Sénécal 2006), and look-ahead schemes (Lin et al. 2013). The sequence ψ∗ is closely related to the generalized eigenfunctions described in Whiteley and Lee (2014), but in that work it was the particle filter, as opposed to the HMM, that was twisted to define alternative approximations of L. Generalization to the time-inhomogeneous HMM setting is fairly straightforward, so we restrict ourselves to the time-homogeneous setting for clarity of exposition.
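
To make the role of the BPF and of the estimator Z^N concrete, here is a minimal, self-contained Python sketch of a bootstrap particle filter for a generic state space model. The model interface (sample_initial, sample_transition, log_observation_density) is hypothetical and chosen purely for illustration; it is not an interface defined in the paper.

    import numpy as np

    def bootstrap_particle_filter(y, sample_initial, sample_transition,
                                  log_observation_density, N=1000, rng=None):
        """Unbiased estimate of the marginal likelihood L = p(y_{1:T}).

        Minimal sketch with multinomial resampling at every step: the estimate
        is the product over time of the averaged unnormalized weights.
        """
        rng = np.random.default_rng() if rng is None else rng
        T = len(y)
        log_L = 0.0
        x = sample_initial(N, rng)                        # xi_1^{1:N} ~ initial density
        for t in range(T):
            log_w = log_observation_density(x, y[t])      # log g(xi_t^i, y_t), shape (N,)
            m = log_w.max()
            w = np.exp(log_w - m)
            log_L += m + np.log(w.mean())                 # accumulate log of Z^N
            if t < T - 1:
                idx = rng.choice(N, size=N, p=w / w.sum())  # multinomial resampling
                x = sample_transition(x[idx], rng)          # propagate via the transition f
        return np.exp(log_L)

For the linear Gaussian model listed in the outline below, for instance, sample_transition could draw from a Gaussian autoregression and log_observation_density could evaluate a Gaussian density, in which case the estimate can be checked against the exact likelihood computed by a Kalman filter.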

Twisted Models and the ψ-Auxiliary Particle Filter
Asymptotic Variance of the ψ-APF
Classes of f and ψ
The Iterated Auxiliary Particle Filter
Approximations of Smoothing Expectations
Applications and Examples
Implementation Details
Linear Gaussian Model
Univariate Stochastic Volatility Model
Multivariate Stochastic Volatility Model
Findings
Discussion
