Abstract

Sampling from complex distributions is an important but challenging topic in scientific and statistical computation. We synthesize three ideas (tempering, resampling, and Markov moving) and propose a general framework of resampling Markov chain Monte Carlo (MCMC). This framework not only accommodates various existing algorithms, including resample-move, importance resampling MCMC, and equi-energy sampling, but also leads to a generalized resample-move algorithm. We provide some basic analysis of these algorithms within the general framework, and present three simulation studies to compare these algorithms together with parallel tempering in the difficult situation where new modes emerge in the tails of previous tempering distributions. Our analysis and empirical results suggest that generalized resample-move tends to perform the best among all the algorithms studied when the Markov kernels mix fast, or at least mix fast locally, toward restricted distributions, whereas parallel tempering tends to perform the best when the Markov kernels mix slowly and do not converge fast even to restricted distributions. Moreover, importance resampling MCMC and equi-energy sampling perform similarly to each other, often performing worse than independence Metropolis resampling MCMC. Therefore, different algorithms seem to have advantages in different settings.
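
To make the three ingredients named above concrete, the following is a minimal sketch (not the paper's exact algorithm) of a resample-move style update that combines tempering, resampling, and Markov moving. The bimodal target, temperature ladder, proposal scale, and particle count are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_pi(x):
    # Hypothetical bimodal target: an equal mixture of N(-3, 1) and N(3, 1).
    return np.logaddexp(-0.5 * (x + 3.0) ** 2, -0.5 * (x - 3.0) ** 2)

def resample_move(particles, beta_prev, beta_new, n_moves=5, step=1.0):
    # 1) Resampling: reweight particles from pi^beta_prev to pi^beta_new
    #    and draw an equally weighted population (multinomial resampling).
    logw = (beta_new - beta_prev) * log_pi(particles)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    particles = particles[idx].copy()

    # 2) Markov moving: a few random-walk Metropolis steps targeting pi^beta_new
    #    to restore diversity among the duplicated particles.
    for _ in range(n_moves):
        prop = particles + step * rng.standard_normal(len(particles))
        log_acc = beta_new * (log_pi(prop) - log_pi(particles))
        accept = np.log(rng.uniform(size=len(particles))) < log_acc
        particles[accept] = prop[accept]
    return particles

# 3) Tempering: anneal a diffuse particle population toward the target (beta = 1).
betas = np.linspace(0.01, 1.0, 20)
particles = rng.normal(0.0, 10.0, size=2000)
for b_prev, b_new in zip(betas[:-1], betas[1:]):
    particles = resample_move(particles, b_prev, b_new)

print("population mean ~", particles.mean(), "(modes at -3 and 3, so near 0 is expected)")
```

The point of the sketch is only to show how resampling transfers a population between adjacent tempering levels while Markov moves restore diversity at each level; the algorithms compared in the paper differ in how these steps are scheduled and combined.
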
