Abstract

It is known that the distribution of nonreversible Markov processes breaking the detailed balance condition converges faster to the stationary distribution compared to reversible processes having the same stationary distribution. This is used in practice to accelerate Markov chain Monte Carlo algorithms that sample the Gibbs distribution by adding nonreversible transitions or nongradient drift terms. The breaking of detailed balance also accelerates the convergence of empirical estimators to their ergodic expectation in the long-time limit. Here, we give a physical interpretation of this second form of acceleration in terms of currents associated with the fluctuations of empirical estimators using the level 2.5 of large deviations, which characterizes the likelihood of density and current fluctuations in Markov processes. Focusing on diffusion processes, we show that there is accelerated convergence because estimator fluctuations arise in general with current fluctuations, leading to an added large deviation cost compared to the reversible case, which shows no current. We study the current fluctuation most likely to arise in conjunction with a given estimator fluctuation and provide bounds on the acceleration, based on approximations of this current. We illustrate these results for the Ornstein-Uhlenbeck process in two dimensions and the Brownian motion on the circle.
