Abstract
Part of the challenge for quantum many-body problems comes from the difficulty of representing large-scale quantum states, which in general requires an exponentially large number of parameters. Neural networks provide a powerful tool to represent quantum many-body states. An important open question is what characterizes the representational power of deep and shallow neural networks, which is of fundamental interest due to the popularity of deep learning methods. Here, we give a proof that, assuming a widely believed computational complexity conjecture, a deep neural network can efficiently represent most physical states, including the ground states of many-body Hamiltonians and states generated by quantum dynamics, while a shallow network representation with a restricted Boltzmann machine cannot efficiently represent some of those states.
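For reference, here is a minimal sketch of the two ansätze compared in this work, written in the standard form used in the neural-network quantum-state literature (e.g. Ref. [5]); these equations are not reproduced from this page. An RBM couples a single hidden layer h to the visible spins s, while a DBM adds a second "deep" layer d coupled only to h.

```latex
% RBM: one hidden layer h, summed out to give the wavefunction amplitude
\Psi_{\mathrm{RBM}}(s) = \sum_{\{h_j=\pm1\}} \exp\!\Big(\sum_i a_i s_i + \sum_j b_j h_j + \sum_{ij} W_{ij}\, s_i h_j\Big)

% DBM: an additional deep layer d, coupled only to the hidden layer h
\Psi_{\mathrm{DBM}}(s) = \sum_{\{h_j=\pm1\}} \sum_{\{d_k=\pm1\}} \exp\!\Big(\sum_i a_i s_i + \sum_j b_j h_j + \sum_k c_k d_k
  + \sum_{ij} W_{ij}\, s_i h_j + \sum_{jk} W'_{jk}\, h_j d_k\Big)
```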
Highlights
Part of the challenge for quantum many-body problems comes from the difficulty of representing large-scale quantum states, which in general requires an exponentially large number of parameters
We prove that a deep Boltzmann machine (DBM) can efficiently represent most physical states, including the ground states of many-body Hamiltonians and states generated by quantum dynamics, while a restricted Boltzmann machine (RBM) cannot efficiently represent some of those states
For the limitation of RBMs, we introduce an explicit class of states that can be generated either by a polynomial-size quantum circuit or as ground states of gapped Hamiltonians, and we prove that these states admit no efficient RBM representation unless the polynomial hierarchy, a generalization of the famous P versus NP problem in computer science, collapses, which is widely believed to be unlikely
Summary
Part of the challenge for quantum many-body problems comes from the difficulty of representing large-scale quantum states, which in general requires an exponentially large number of parameters. We give a proof that, assuming a widely believed computational complexity conjecture, a deep neural network can efficiently represent most physical states, including the ground states of many-body Hamiltonians and states generated by quantum dynamics, while a shallow network representation with a restricted Boltzmann machine cannot efficiently represent some of those states. Numerical evidence suggests that the restricted Boltzmann machine (RBM), a shallow generative neural network optimized by a reinforcement-learning method, provides a good solution to several many-body models [5]. Given this success, an important open question is what characterizes the representational power and limitations of the RBM for quantum many-body states. Our result shows that there is an exponential separation in efficiency between deep Boltzmann machines (DBMs) and RBMs in representing quantum many-body states.
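As an illustration (not code from the paper), the RBM amplitude of a given spin configuration can be evaluated in polynomial time because the hidden layer can be traced out analytically. A minimal NumPy sketch, assuming spins and hidden units take values ±1 and using arbitrary illustrative parameter sizes:

```python
import numpy as np

def rbm_amplitude(s, a, b, W):
    """Unnormalized RBM wavefunction amplitude for one spin configuration.

    The hidden units h_j in {-1, +1} are summed out in closed form:
        sum_h exp(a.s + b.h + s^T W h) = exp(a.s) * prod_j 2*cosh(b_j + (W^T s)_j)

    s : visible spins in {-1, +1}, shape (N,)
    a : visible biases, shape (N,)
    b : hidden biases, shape (M,)
    W : visible-hidden couplings, shape (N, M)
    """
    theta = b + W.T @ s                      # effective field on each hidden unit
    return np.exp(a @ s) * np.prod(2 * np.cosh(theta))

# Toy usage with random complex parameters for N = 4 visible and M = 8 hidden units
rng = np.random.default_rng(0)
N, M = 4, 8
a = rng.normal(size=N) + 1j * rng.normal(size=N)
b = rng.normal(size=M) + 1j * rng.normal(size=M)
W = 0.1 * (rng.normal(size=(N, M)) + 1j * rng.normal(size=(N, M)))
s = np.array([1, -1, 1, 1])
print(rbm_amplitude(s, a, b, W))
```

This closed-form trace over the hidden layer is what keeps RBM amplitudes cheap to evaluate; for a DBM the additional deep layer generally has no such closed-form sum.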