Abstract

Out-of-time-ordered correlators (OTOCs) are of crucial importance for studying a wide variety of fundamental phenomena in quantum physics, ranging from information scrambling to quantum chaos and many-body localization. However, apart from a few special cases, they are notoriously difficult to compute, even numerically, owing to the exponential complexity of generic quantum many-body systems. In this paper, we introduce a machine learning approach to OTOCs based on the restricted-Boltzmann-machine architecture, which features wide applicability and can handle systems of arbitrary dimension with massive entanglement. We show, through a concrete example involving a two-dimensional transverse-field Ising model, that our method is capable of computing early-time OTOCs with respect to random pure quantum states or infinite-temperature thermal ensembles. Our results showcase the great potential of machine learning techniques for computing OTOCs, which opens up numerous directions for future studies of related physical quantities.
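To make the restricted-Boltzmann-machine ansatz mentioned above concrete, the sketch below evaluates the standard (unnormalized) RBM wavefunction amplitude ψ(s) = exp(Σᵢ aᵢsᵢ) ∏ⱼ 2cosh(bⱼ + Σᵢ Wᵢⱼsᵢ) for a spin configuration s ∈ {−1, +1}ⁿ. This is a generic illustration of the parametrization, not the authors' implementation; the function name `rbm_amplitude` and all parameter values are illustrative assumptions.

```python
import numpy as np

def rbm_amplitude(spins, a, b, W):
    """Unnormalized RBM wavefunction amplitude psi(s) for a spin
    configuration s in {-1, +1}^n_v, with visible biases a (n_v,),
    hidden biases b (n_h,), and complex weights W (n_v, n_h).
    Complex parameters let the ansatz represent generic phases."""
    theta = b + spins @ W          # effective fields on the hidden units
    return np.exp(spins @ a) * np.prod(2.0 * np.cosh(theta))

# Toy example: 4 visible spins, 8 hidden units, small random complex parameters.
rng = np.random.default_rng(0)
n_v, n_h = 4, 8
a = rng.normal(scale=0.1, size=n_v) + 1j * rng.normal(scale=0.1, size=n_v)
b = rng.normal(scale=0.1, size=n_h) + 1j * rng.normal(scale=0.1, size=n_h)
W = (rng.normal(scale=0.1, size=(n_v, n_h))
     + 1j * rng.normal(scale=0.1, size=(n_v, n_h)))

s = np.array([1, -1, 1, 1])
amp = rbm_amplitude(s, a, b, W)    # a finite, nonzero complex amplitude
```

In variational Monte Carlo, such amplitudes are sampled over spin configurations; computing an OTOC additionally requires propagating the parameters in time, which is the nontrivial step the paper addresses.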
