Abstract

The efficiency of Monte Carlo samplers is dictated not only by energetic effects, such as large barriers, but also by entropic effects that are due to the sheer volume that is sampled. The latter effects appear in the form of an entropic mismatch or divergence between the direct and reverse trial moves. We provide lower and upper bounds for the average acceptance probability in terms of the Rényi divergence of order 1/2. We show that the asymptotic finiteness of the entropic divergence is the necessary and sufficient condition for nonvanishing acceptance probabilities in the limit of large dimension. Furthermore, we demonstrate that the upper bound is reasonably tight by showing that the exponent is asymptotically exact for systems made up of a large number of independent and identically distributed subsystems. For the last statement, we provide an alternative proof that relies on the reformulation of the acceptance probability as a large deviation problem. The reformulation also leads to a class of low-variance estimators for strongly asymmetric distributions. We show that the entropic divergence causes a decay in the average displacements with the number of dimensions n that are simultaneously updated. For systems that have a well-defined thermodynamic limit, the decay is demonstrated to be n^(-1/2) for random-walk Monte Carlo and n^(-1/6) for smart Monte Carlo (SMC). Numerical simulations of the 38-atom Lennard-Jones (LJ38) cluster show that SMC is virtually as efficient as the Markov chain implementation of the Gibbs sampler, which is normally utilized for Lennard-Jones clusters. An application of the entropic inequalities to the parallel tempering method demonstrates that the number of replicas increases as the square root of the heat capacity of the system.
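The n^(-1/2) scaling quoted above can be illustrated numerically. The sketch below (an illustration of the general phenomenon, not the paper's own code) estimates the average Metropolis acceptance probability for random-walk moves on an n-dimensional standard Gaussian started in equilibrium; the target, step widths, and sample count are assumptions chosen for the demonstration. With a fixed step width the acceptance collapses as more coordinates are updated at once, while scaling the step as n^(-1/2) keeps it of order one:

```python
import numpy as np

rng = np.random.default_rng(0)

def avg_acceptance(n, sigma, samples=20000):
    """Average Metropolis acceptance for a random-walk trial move of
    width sigma on an n-dimensional standard Gaussian target, with the
    chain started in equilibrium (x drawn from the target itself)."""
    x = rng.standard_normal((samples, n))               # equilibrium states
    y = x + sigma * rng.standard_normal((samples, n))   # trial moves
    # log of the Metropolis ratio pi(y)/pi(x) for the Gaussian target
    log_ratio = 0.5 * (np.sum(x**2, axis=1) - np.sum(y**2, axis=1))
    return float(np.mean(np.minimum(1.0, np.exp(log_ratio))))

# Fixed step width: acceptance collapses with the number of updated dimensions.
print(avg_acceptance(1, 0.5))    # high acceptance in one dimension
print(avg_acceptance(100, 0.5))  # nearly zero in 100 dimensions
# Step width scaled as n^(-1/2): acceptance stays O(1), consistent with the
# quoted decay of the average displacement per coordinate.
print(avg_acceptance(100, 0.5 / np.sqrt(100)))
```

The collapse at fixed step width is exactly the entropic mismatch between direct and reverse moves accumulating across independent coordinates; shrinking the step restores a finite divergence and hence a nonvanishing acceptance probability.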
