Abstract

Hybrid Monte Carlo (HMC) has been widely applied to numerous posterior inference problems in machine learning and statistics. HMC has two main practical issues: first, acceptance rates deteriorate as the system size increases; second, it is sensitive to two user-specified parameters, the step size and the trajectory length. The former issue is addressed by sampling from an integrator-dependent modified or shadow density and compensating for the induced bias via importance sampling. The latter issue is addressed by adaptively setting the HMC parameters, with the state-of-the-art method being the No-U-Turn Sampler (NUTS). We combine the benefits of NUTS with those attained by sampling from the shadow density, by adaptively setting the trajectory length and step size of Separable Shadow Hamiltonian Hybrid Monte Carlo (S2HMC). This leads to a new algorithm, adaptive S2HMC (A-S2HMC), that shows improved performance over S2HMC and NUTS across various targets and leaves the target density invariant.
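The importance-sampling correction mentioned above can be sketched as follows. A shadow-Hamiltonian sampler draws from the shadow density \(\tilde{\pi} \propto \exp(-\tilde{H})\) rather than the target \(\pi \propto \exp(-H)\), so expectations under the target are recovered with self-normalised importance weights \(w_i \propto \exp(\tilde{H}(x_i) - H(x_i))\). The energy values below are illustrative placeholders, not output of the paper's algorithm:

```python
import numpy as np

# Hypothetical per-sample energies under the true (H) and shadow (H_shadow)
# Hamiltonians; in practice these come from the sampler's accepted states.
H = np.array([1.2, 0.8, 1.5, 1.0])
H_shadow = np.array([1.15, 0.82, 1.45, 1.02])
f = np.array([0.3, -0.1, 0.7, 0.2])      # function whose expectation we want

log_w = H_shadow - H                     # importance weights w ∝ exp(H~ - H)
w = np.exp(log_w - log_w.max())          # subtract max for numerical stability
estimate = np.sum(w * f) / np.sum(w)     # self-normalised importance estimate
```

Subtracting the maximum log-weight before exponentiating is a standard stabilisation trick; the self-normalised estimate is unchanged because the constant cancels in the ratio.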

Highlights

  • Hybrid Monte Carlo (HMC) [1] is a widely employed inference method for sampling complex posterior distributions in machine learning and statistics

  • We show that our adaptive Separable Shadow Hamiltonian Hybrid Monte Carlo (A-S2HMC) method achieves better exploration of the posterior and higher effective sample sizes than standard S2HMC across various benchmarks, while leaving the target density invariant

  • To show that adaptive S2HMC (A-S2HMC) satisfies detailed balance, we show that a No-U-Turn Sampler (NUTS) algorithm that uses the processed leapfrog integrator is both symplectic and reversible

INTRODUCTION

Hybrid Monte Carlo (HMC) [1] is a widely employed inference method for sampling complex posterior distributions in machine learning and statistics. One approach to addressing the deterioration in acceptance rates as the system size increases is the use of modified, or shadow, Hamiltonian based samplers. These methods leverage backward error analysis of the numerical integrator, which shows that the shadow Hamiltonian is conserved to higher order than the true Hamiltonian [14]. Heide et al. [11] derive a non-separable shadow Hamiltonian for the generalised leapfrog integrator in Riemannian manifold Hamiltonian Monte Carlo (RMHMC), which improves performance relative to sampling from the true Hamiltonian. These methods still suffer from the practical impediment of setting the integration step size and trajectory length. We overcome this constraint by relying on a separable shadow Hamiltonian based sampler, which allows tuning of the step size through dual averaging and of the trajectory length via a binary tree recursion, as in Hoffman and Gelman [12].
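To make the two tuning parameters concrete, the following is a minimal sketch of standard HMC with a leapfrog integrator, not the A-S2HMC algorithm itself. The step size `eps` and number of leapfrog steps `L` are exactly the quantities that dual averaging and the binary tree recursion adapt; here they are fixed by hand, and the target is a standard normal chosen purely for illustration:

```python
import numpy as np

def leapfrog(q, p, grad_U, eps, L):
    """Leapfrog integration of Hamiltonian dynamics with kinetic energy p^T p / 2."""
    q, p = q.copy(), p.copy()
    p -= 0.5 * eps * grad_U(q)           # initial half step in momentum
    for _ in range(L - 1):
        q += eps * p                     # full step in position
        p -= eps * grad_U(q)             # full step in momentum
    q += eps * p
    p -= 0.5 * eps * grad_U(q)           # final half step in momentum
    return q, p

def hmc(U, grad_U, q0, eps=0.2, L=20, n_samples=5000, seed=0):
    """Minimal HMC sampler; returns the array of chain states."""
    rng = np.random.default_rng(seed)
    q = np.asarray(q0, dtype=float)
    samples = []
    for _ in range(n_samples):
        p = rng.standard_normal(q.shape)         # resample momentum
        q_new, p_new = leapfrog(q, p, grad_U, eps, L)
        # Metropolis correction for the integrator's energy error
        dH = (U(q_new) + 0.5 * p_new @ p_new) - (U(q) + 0.5 * p @ p)
        if dH < 0 or rng.random() < np.exp(-dH):
            q = q_new
        samples.append(q.copy())
    return np.array(samples)

# Target: standard normal, i.e. potential U(q) = q^T q / 2
U = lambda q: 0.5 * float(q @ q)
grad_U = lambda q: q
samples = hmc(U, grad_U, np.zeros(1))
```

Shadow-Hamiltonian variants such as S2HMC replace the accept/reject energy `H` with the shadow Hamiltonian, which the leapfrog integrator conserves to higher order, and correct the resulting bias by importance sampling.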

HAMILTONIAN MONTE CARLO
ADAPTIVE SEPARABLE SHADOW HYBRID HAMILTONIAN MONTE CARLO
NUMERICAL EXPERIMENTS
JUMP DIFFUSION PROCESS
NEAL’S FUNNEL DENSITY
BAYESIAN LOGISTIC REGRESSION
CONCLUSION