Abstract

Monte Carlo methods have become essential tools for solving complex Bayesian inference problems in different fields, such as computational statistics, machine learning, and statistical signal processing. In this work, we introduce a novel class of adaptive Monte Carlo methods, called adaptive independent sticky Markov chain Monte Carlo (MCMC) algorithms, to sample efficiently from any bounded target probability density function (pdf). The new class of algorithms employs adaptive non-parametric proposal densities, which become progressively closer to the target as the number of iterations increases. The proposal pdf is built using interpolation procedures based on a set of support points that is constructed iteratively from previously drawn samples. The efficiency of the algorithms is ensured by a test that supervises the evolution of the set of support points; this extra stage controls the computational cost and the convergence of the proposal density to the target. Each part of the novel family of algorithms is discussed and several examples of specific methods are provided. Although the novel algorithms are presented for univariate target densities, we show how they can be easily extended to the multivariate context by embedding them within a Gibbs-type sampler or the hit-and-run algorithm. Ergodicity is ensured and discussed. An overview of related work in the literature is also provided, emphasizing that several well-known existing methods (such as the adaptive rejection Metropolis sampling (ARMS) scheme) are encompassed by the new class of algorithms proposed here. Eight numerical examples (including the inference of the hyper-parameters of Gaussian processes, widely used in machine learning for signal processing applications) illustrate the efficiency of the sticky schemes, both as stand-alone methods to sample from complicated one-dimensional pdfs and within Gibbs samplers to draw from multi-dimensional target distributions.
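As a concrete illustration of this interpolation-based construction, the sketch below builds a piecewise-linear proposal from a set of support points at which the (unnormalized) target has been evaluated, and draws samples from it by inverting the CDF of the linear density within each segment. It is only a minimal sketch under explicit assumptions: the helper names (build_proposal, sample_proposal, eval_proposal) are illustrative, the piecewise-linear interpolation is just one of the constructions discussed in the paper, and the target is assumed to be effectively supported inside the interval covered by the support points.

```python
import numpy as np

def build_proposal(support, target):
    """Piecewise-linear interpolation of an unnormalized target on [s_1, s_m].

    `support` must be sorted; returns the node locations, the target values
    at the nodes, and the (unnormalized) area of each linear segment.
    """
    s = np.asarray(support, dtype=float)
    f = np.array([target(x) for x in s])
    areas = 0.5 * (f[:-1] + f[1:]) * np.diff(s)        # trapezoid areas
    return s, f, areas

def sample_proposal(s, f, areas, rng):
    """Draw one sample: pick a segment with probability proportional to its
    area, then invert the CDF of the linear density inside that segment."""
    k = rng.choice(len(areas), p=areas / areas.sum())
    a, b, fa, fb = s[k], s[k + 1], f[k], f[k + 1]
    u = rng.uniform()
    if np.isclose(fa, fb):
        t = u                                          # flat segment: uniform draw
    else:
        t = (np.sqrt(fa**2 + u * (fb**2 - fa**2)) - fa) / (fb - fa)
    return a + t * (b - a)

def eval_proposal(x, s, f, areas):
    """Normalized proposal density at x (zero outside [s_1, s_m])."""
    if x < s[0] or x > s[-1]:
        return 0.0
    k = int(np.clip(np.searchsorted(s, x) - 1, 0, len(areas) - 1))
    w = (x - s[k]) / (s[k + 1] - s[k])
    return ((1 - w) * f[k] + w * f[k + 1]) / areas.sum()

# Toy usage on a bimodal unnormalized target.
target = lambda x: np.exp(-0.5 * (x + 2)**2) + 0.5 * np.exp(-0.5 * (x - 3)**2)
rng = np.random.default_rng(0)
s, f, areas = build_proposal(np.linspace(-6.0, 6.0, 8), target)
draws = [sample_proposal(s, f, areas, rng) for _ in range(5)]
```

As more support points are added where the interpolation disagrees with the target, the proposal "sticks" to the target, which is the mechanism the abstract refers to.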

Highlights

  • Markov chain Monte Carlo (MCMC) methods [1, 2] are very important tools for Bayesian inference and numerical approximation, widely employed in signal processing [3, 4, 5, 6, 7] and other related fields [1, 8]

  • Theoretical results: we provide theoretical results regarding the ergodicity of the proposed approach, the convergence of a sticky proposal to the target, and the expected growth of the number of support points of the proposal

  • Sticky MCMC methods within Gibbs sampling: even in a simple bivariate scenario, adaptive independent sticky Metropolis (AISM) schemes can be useful within a Gibbs sampler (Example 1 compares different MCMC-within-Gibbs schemes); a minimal structural sketch follows this list
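The snippet below sketches the generic structure behind this kind of within-Gibbs use: a bivariate target is sampled by alternately drawing each coordinate from its full conditional with a one-dimensional kernel. In a sticky-within-Gibbs scheme, each one-dimensional kernel would be an adaptive independent sticky Metropolis step targeting the corresponding conditional; the interface and names below are purely illustrative, and the toy usage replaces the sticky kernels with the exact Gaussian conditionals of a correlated bivariate normal.

```python
import numpy as np

def gibbs_with_1d_kernels(draw_x1_given_x2, draw_x2_given_x1, x_init, n_iters, rng):
    """Generic two-dimensional Gibbs sampler.

    `draw_x1_given_x2(x2, rng)` and `draw_x2_given_x1(x1, rng)` draw from the
    full conditional pdfs; in a sticky-within-Gibbs scheme each callable would
    be an adaptive independent sticky Metropolis step for that conditional.
    """
    x1, x2 = x_init
    chain = np.empty((n_iters, 2))
    for t in range(n_iters):
        x1 = draw_x1_given_x2(x2, rng)   # update the first coordinate
        x2 = draw_x2_given_x1(x1, rng)   # update the second coordinate
        chain[t] = (x1, x2)
    return chain

# Toy usage: standard bivariate Gaussian with correlation rho, whose full
# conditionals are exactly Gaussian (stand-ins for sticky 1D kernels).
rho = 0.8
cond = lambda other, rng: rng.normal(rho * other, np.sqrt(1.0 - rho**2))
chain = gibbs_with_1d_kernels(cond, cond, (0.0, 0.0), 5000, np.random.default_rng(1))
```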

Summary

Introduction

Markov chain Monte Carlo (MCMC) methods [1, 2] are very important tools for Bayesian inference and numerical approximation, and they are widely employed in signal processing [3, 4, 5, 6, 7] and other related fields [1, 8]. In this work we describe a very general framework for designing suitable adaptive MCMC algorithms with non-parametric proposal densities, allowing practitioners to build proper adaptive MCMC methods around such a proposal. We first describe the simplest possible sticky method, obtained by using the Metropolis-Hastings (MH) algorithm, whereas in Section 7 we consider a more sophisticated and more efficient scheme based on the multiple try Metropolis (MTM) algorithm: the adaptive independent sticky multiple try Metropolis (AISMTM) method.
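To make the sticky mechanism concrete, here is a minimal sketch of an AISM iteration built on the proposal helpers from the sketch given after the abstract: an independent Metropolis-Hastings step with the interpolated proposal, followed by an update test that decides whether the candidate is added to the support set. The particular rule shown (add the candidate to the support set with a probability that grows with the mismatch between proposal and target at that point) is one simple illustrative choice among the update rules discussed in the paper; all names are assumptions of this sketch, and the initial state is assumed to lie inside the interval covered by the support points.

```python
def aism(target, support, x0, n_iters, rng):
    """Adaptive independent sticky Metropolis (AISM): a minimal sketch.

    `target` is the unnormalized target pdf, `support` an initial sorted list
    of support points covering the region of interest, and `x0` an initial
    state inside that region.  Reuses build_proposal / sample_proposal /
    eval_proposal from the earlier sketch.
    """
    support = sorted(support)
    s, f, areas = build_proposal(support, target)
    x, chain = x0, []
    for _ in range(n_iters):
        z = sample_proposal(s, f, areas, rng)                 # independent proposal
        # Metropolis-Hastings acceptance ratio for an independence sampler.
        num = target(z) * eval_proposal(x, s, f, areas)
        den = target(x) * eval_proposal(z, s, f, areas)
        if rng.uniform() < min(1.0, num / den):
            x = z
        chain.append(x)
        # Sticky update test: add z to the support set with a probability that
        # increases with the discrepancy between the proposal and the target at z.
        pz = target(z)
        qz = eval_proposal(z, s, f, areas) * areas.sum()      # unnormalized interpolation at z
        eta = 1.0 - min(pz, qz) / max(pz, qz)                 # discrepancy in [0, 1]
        if rng.uniform() < eta:
            support = sorted(support + [z])
            s, f, areas = build_proposal(support, target)
    return np.array(chain), support
```

Note how the update test keeps the computational cost under control: once the proposal matches the target well, eta is close to zero and new support points are added only rarely, which is the role of the supervision test described in the abstract.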

Adaptive independent sticky Metropolis
Test to update St
Examples of constructions
Examples of update rules
Other examples of sticky MCMC methods
Adaptive independent sticky MTM
MTM step
Update rules for AISMTM
Range of applicability and multivariate generation
Sticky MCMC methods within Recycling Gibbs sampling
Automatic Relevance Determination kernel function
Conclusions
Proof of Theorem 3