Abstract

Efficient sampling of complex high-dimensional probability distributions is a central task in computational science. Machine learning methods such as autoregressive neural networks, used together with Markov chain Monte Carlo sampling, provide good approximations to such distributions, but suffer from either intrinsic bias or high variance. In this Letter, we propose a way to make this approximation unbiased while keeping its variance low. Our method uses physical symmetries and variable-size cluster updates that exploit the structure of the autoregressive factorization. We test our method on first- and second-order phase transitions of classical spin systems, showing its viability for critical systems and in the presence of metastable states.
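The autoregressive factorization mentioned above writes the variational distribution as q(s) = ∏_i q(s_i | s_1, …, s_{i-1}), so a configuration can be sampled exactly one site at a time. A minimal sketch of this ancestral sampling, using a hypothetical stand-in `toy_cond` for the trained network's conditionals:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_autoregressive(cond_prob, n):
    """Draw one spin configuration s in {-1, +1}^n from the factorized
    distribution q(s) = prod_i q(s_i | s_<i), one site at a time.
    Also accumulates log q(s), which exact sampling gives for free."""
    s = np.zeros(n, dtype=int)
    log_q = 0.0
    for i in range(n):
        p_up = cond_prob(s[:i])  # q(s_i = +1 | s_1, ..., s_{i-1})
        if rng.random() < p_up:
            s[i], log_q = 1, log_q + np.log(p_up)
        else:
            s[i], log_q = -1, log_q + np.log(1.0 - p_up)
    return s, log_q

# Toy conditional (assumption, not the paper's network): a slight
# ferromagnetic bias toward the previous spin.
def toy_cond(prefix):
    return 0.5 if prefix.size == 0 else 0.5 + 0.2 * prefix[-1]

s, log_q = sample_autoregressive(toy_cond, 10)
```

Because each draw also yields its exact probability q(s), these samples can feed the Metropolis-Hastings corrections discussed below.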

Highlights

  • Markov chain Monte Carlo (MCMC) [1] is an unbiased numerical method that allows sampling from unnormalized probability distributions, a central task in many areas of computational science

  • We first show that existing unbiased sampling schemes using global updates proposed by generative neural samplers (GNSs) can be plagued by an ergodicity issue due to the generic presence of “exponentially suppressed configurations (ESCs),” which have a limited effect on the variational free energy but a rather strong effect on the autocorrelation time

  • In this Letter, we have shown a strategy to systematically remove the bias of variational autoregressive neural network methods and, at the same time, keep the variance of observables under control


Summary

INTRODUCTION

Markov chain Monte Carlo (MCMC) [1] is an unbiased numerical method that allows sampling from unnormalized probability distributions, a central task in many areas of computational science. We first show that existing unbiased sampling schemes using global updates proposed by generative neural samplers (GNSs) can be plagued by an ergodicity issue due to the generic presence of “exponentially suppressed configurations (ESCs),” which have a limited effect on the variational free energy but a rather strong effect on the autocorrelation time. Our workaround to this problem consists of two ingredients: symmetry-enforcing updates and neural cluster updates. We show that the method greatly alleviates the metastability issue, as it can rapidly thermalize through cluster updates.
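The global-update scheme referred to here is Metropolis-Hastings with independent proposals drawn from the sampler q: a whole new configuration is proposed and accepted with probability min(1, p(s')q(s) / (p(s)q(s'))), which removes the variational bias. A minimal sketch for a 1D Ising chain, using i.i.d. uniform spins as a hypothetical stand-in for the trained GNS proposal:

```python
import numpy as np

rng = np.random.default_rng(1)
beta, n = 0.4, 16

def energy(s):
    """1D Ising chain with periodic boundary: E = -sum_i s_i s_{i+1}."""
    return -np.sum(s * np.roll(s, -1))

def propose(n):
    """Global proposal from a stand-in sampler q: i.i.d. uniform spins,
    so log q(s) = -n log 2 for every configuration."""
    return rng.choice([-1, 1], size=n), -n * np.log(2.0)

s, log_q = propose(n)
accepted = 0
for step in range(2000):
    s_new, log_q_new = propose(n)
    # Metropolis-Hastings ratio for independent proposals:
    # A = min(1, p(s') q(s) / (p(s) q(s'))) with p ∝ exp(-beta E).
    log_A = (-beta * energy(s_new) + log_q) - (-beta * energy(s) + log_q_new)
    if np.log(rng.random()) < log_A:
        s, log_q = s_new, log_q_new
        accepted += 1
acceptance_rate = accepted / 2000
```

The ergodicity problem described in the text appears when q assigns exponentially small probability to configurations that p does not: the chain then almost never proposes them, inflating the autocorrelation time even though the variational free energy looks good.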

Bias in neural sampling
NIS and global updates
Symmetry-enforcing updates
Neural cluster updates
METHODS
[Truncated algorithm listing; step 1 reads: “Input the current configuration s”]
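A cluster update of this kind can exploit the autoregressive factorization directly: keep a prefix s_1 … s_{k-1} of the current configuration fixed, resample the remaining sites from the model's conditionals, and accept with the Metropolis-Hastings ratio in which the shared prefix cancels. The sketch below illustrates this idea under stated assumptions (a toy conditional `cond_prob` in place of the trained network, a 1D periodic Ising chain, and a uniformly chosen cluster boundary k); it is not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)

def cond_prob(prefix):
    """Stand-in (assumption) for the trained autoregressive conditional
    q(s_i = +1 | s_<i): a slight bias toward the previous spin."""
    return 0.5 if prefix.size == 0 else 0.5 + 0.2 * prefix[-1]

def resample_tail(s, k):
    """Resample sites k..n-1 from the conditionals, keeping s_0..s_{k-1}
    fixed. Returns the proposal and log q(tail | prefix) for both tails."""
    def tail_logq(conf):
        lq = 0.0
        for i in range(k, conf.size):
            p = cond_prob(conf[:i])
            lq += np.log(p if conf[i] == 1 else 1.0 - p)
        return lq
    s_new = s.copy()
    for i in range(k, s.size):
        p = cond_prob(s_new[:i])
        s_new[i] = 1 if rng.random() < p else -1
    return s_new, tail_logq(s), tail_logq(s_new)

def energy(s):
    return -np.sum(s * np.roll(s, -1))  # 1D periodic Ising chain

beta, n = 0.4, 12
s = rng.choice([-1, 1], size=n)
for step in range(500):
    k = rng.integers(0, n)  # variable cluster size: sites k..n-1
    s_new, lq_old, lq_new = resample_tail(s, k)
    # MH ratio; the fixed prefix is shared by both states and cancels:
    # A = min(1, p(s') q(tail | prefix) / (p(s) q(tail' | prefix))).
    log_A = -beta * (energy(s_new) - energy(s)) + (lq_old - lq_new)
    if np.log(rng.random()) < log_A:
        s = s_new
```

Varying k varies the cluster size, so the chain mixes large, nearly independent moves (small k) with cheap local ones (large k).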
Ising model
CONCLUSIONS
