Abstract

Out-of-time-ordered correlators (OTOCs) are of crucial importance for studying a wide variety of fundamental phenomena in quantum physics, ranging from information scrambling to quantum chaos and many-body localization. However, apart from a few special cases, they are notoriously difficult to compute even numerically due to the exponential complexity of generic quantum many-body systems. In this paper, we introduce a machine learning approach to OTOCs based on the restricted-Boltzmann-machine architecture, which features wide applicability and could work for arbitrary-dimensional systems with massive entanglement. We show, through a concrete example involving a two-dimensional transverse field Ising model, that our method is capable of computing early-time OTOCs with respect to random pure quantum states or infinite-temperature thermal ensembles. Our results showcase the great potential of machine learning techniques in computing OTOCs, opening up numerous directions for future studies of similar physical quantities.
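As a minimal illustration of the restricted-Boltzmann-machine (RBM) ansatz underlying this approach, the sketch below builds the standard neural-network-quantum-state amplitude, psi(s) = exp(sum_i a_i s_i) * prod_j 2 cosh(b_j + sum_i W_ji s_i), for a tiny spin system. This is a generic toy example with hypothetical random parameters and sizes, not the authors' implementation or training procedure:

```python
import numpy as np

rng = np.random.default_rng(0)
n_visible, n_hidden = 4, 8  # hypothetical sizes for a toy example

# RBM variational parameters (randomly initialized here; in practice
# they would be optimized, e.g. to represent a time-evolved state).
a = 0.01 * rng.standard_normal(n_visible)              # visible biases
b = 0.01 * rng.standard_normal(n_hidden)               # hidden biases
W = 0.01 * rng.standard_normal((n_hidden, n_visible))  # couplings

def rbm_amplitude(s):
    """Unnormalized RBM amplitude for a spin configuration s with entries +/-1."""
    theta = b + W @ s
    return np.exp(a @ s) * np.prod(2.0 * np.cosh(theta))

# For this tiny system we can enumerate all 2^n configurations and
# assemble the full (normalized) state vector explicitly.
configs = np.array([[1 if (k >> i) & 1 else -1 for i in range(n_visible)]
                    for k in range(2 ** n_visible)])
psi = np.array([rbm_amplitude(s) for s in configs])
psi /= np.linalg.norm(psi)
```

For large systems the full enumeration above is of course infeasible; RBM-based methods instead sample configurations (e.g. by Monte Carlo) and evaluate amplitude ratios, which is what makes the ansatz scalable to entangled, higher-dimensional systems.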
