Abstract

A core technology that has emerged from the artificial intelligence revolution is the recurrent neural network (RNN). Its unique sequence-based architecture provides a tractable likelihood estimate with stable training paradigms, a combination that has precipitated many spectacular advances in natural language processing and neural machine translation. This architecture also makes the RNN a good candidate for a variational wave function, where the RNN parameters are tuned to learn the approximate ground state of a quantum Hamiltonian. In this paper, we demonstrate the ability of RNNs to represent several many-body wave functions, optimizing the variational parameters using a stochastic approach. Among other attractive features of these variational wave functions, their autoregressive nature allows for the efficient calculation of physical estimators by providing independent samples. We demonstrate the effectiveness of RNN wave functions by calculating ground state energies, correlation functions, and entanglement entropies for several quantum spin models of interest to condensed matter physicists in one and two spatial dimensions.
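To make the autoregressive sampling mentioned above concrete, the following minimal sketch (plain Python/NumPy, not the authors' implementation; the network size, random initialization, and names such as sample_configuration are hypothetical) shows how a simple recurrent network defines a normalized distribution over spin configurations and draws independent samples from it, one spin at a time.

    import numpy as np

    rng = np.random.default_rng(0)
    N, hidden_dim = 10, 16   # chain length and hidden-state size (illustrative values)

    # Randomly initialized variational parameters; in a real calculation these
    # would be optimized to lower the variational energy.
    W_h = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
    W_x = rng.normal(scale=0.1, size=(hidden_dim, 2))
    b_h = np.zeros(hidden_dim)
    W_s = rng.normal(scale=0.1, size=(2, hidden_dim))
    b_s = np.zeros(2)

    def softmax(z):
        e = np.exp(z - z.max())
        return e / e.sum()

    def sample_configuration():
        """Draw one spin configuration and its probability autoregressively."""
        h = np.zeros(hidden_dim)   # initial hidden state
        x = np.zeros(2)            # one-hot encoding of the "previous" spin
        config, prob = [], 1.0
        for _ in range(N):
            h = np.tanh(W_h @ h + W_x @ x + b_h)   # recurrent update
            p = softmax(W_s @ h + b_s)             # conditional P(sigma_i | sigma_<i)
            s = rng.choice(2, p=p)                 # sample the next spin
            config.append(s)
            prob *= p[s]
            x = np.eye(2)[s]
        return np.array(config), prob

    # Each call returns a statistically independent sample, so estimators such as
    # energies or correlation functions can be averaged over samples without the
    # autocorrelation issues of Markov-chain sampling.
    sigma, p_sigma = sample_configuration()
    print(sigma, p_sigma)

A full RNN wave function in general also assigns a phase to each configuration so that amplitudes can be complex; this sketch keeps only the probability part for brevity.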

Highlights

  • The last decade has marked the start of a worldwide artificial intelligence (AI) revolution, which is dramatically affecting industry, science, and society

  • We have introduced recurrent neural network wave functions, a variational ansatz for quantum many-body systems, which we use to approximate ground-state energies, correlation functions, and entanglement of many-body Hamiltonians of interest to condensed-matter physics

  • We find that recurrent neural network (RNN) wave functions are competitive with state-of-the-art methods such as density-matrix renormalization group (DMRG) and PixelCNN wave functions [56], performing well on the task of finding the ground state of the transverse field Ising model in two dimensions


Summary

INTRODUCTION

The last decade has marked the start of a worldwide artificial intelligence (AI) revolution, which is dramatically affecting industry, science, and society. Some of the most important algorithmic advances in natural language processing (NLP) have been developed in the context of sequence learning using recurrent neural networks (RNNs) [20,21,22,23,24]. These have led to impressive performance in speech and text comprehension, as well as state-of-the-art results in neural machine translation. We explore whether the power and scalability of NLP models such as the RNN can be extended to applications in physical systems, in particular to perform variational calculations to find the low-energy states of quantum many-body Hamiltonians. We show that the intrinsic bias of our ansatz can be systematically reduced to yield highly accurate ground-state approximations of large quantum systems.
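For context, the variational calculations referred to here follow the standard variational Monte Carlo formulation (a sketch in generic notation, not reproduced from the paper): the variational energy upper-bounds the ground-state energy and is estimated stochastically from samples of the wave function,

    E_\theta \;=\; \frac{\langle \Psi_\theta | \hat{H} | \Psi_\theta \rangle}{\langle \Psi_\theta | \Psi_\theta \rangle} \;\ge\; E_{\mathrm{ground}},
    \qquad
    E_\theta \;\approx\; \frac{1}{M} \sum_{k=1}^{M} E_{\mathrm{loc}}\bigl(\sigma^{(k)}\bigr),
    \qquad
    E_{\mathrm{loc}}(\sigma) \;=\; \sum_{\sigma'} H_{\sigma\sigma'} \, \frac{\Psi_\theta(\sigma')}{\Psi_\theta(\sigma)},

where the configurations \sigma^{(k)} are drawn independently from |\Psi_\theta(\sigma)|^2 and the parameters \theta are updated by stochastic gradient descent on E_\theta. Because the samples are independent, the statistical error of the estimator falls as 1/\sqrt{M}; the bias of the ansatz itself can then be reduced by systematically enlarging the network, for example its hidden-state dimension.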

RNNs for classical probability distributions
RNN wave functions
GROUND STATES WITH RNN WAVE FUNCTIONS
Scaling of resources
CONCLUSIONS AND OUTLOOK
Imposing discrete symmetries
Imposing zero magnetization