Quantum generative models hold the promise of accelerating or improving machine learning tasks by leveraging the probabilistic nature of quantum states, but the successful optimization of these models remains a difficult challenge. To tackle this challenge, we present a new architecture for quantum generative modeling that combines insights from classical machine learning and quantum phases of matter. In particular, our model utilizes both many-body localized (MBL) dynamics and hidden units to improve the optimization of the model. We demonstrate the applicability of our model on a diverse set of classical and quantum tasks, including a toy version of MNIST handwritten digits, quantum data obtained from quantum many-body states, and nonlocal parity data. Our architecture and algorithm provide novel strategies for utilizing quantum many-body systems as learning resources and reveal a powerful connection between disorder, interaction, and learning in quantum many-body systems.
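As a rough illustration of the kind of architecture the abstract describes, the sketch below prepares a quantum state by evolving under a disordered spin chain (a standard setting in which MBL-like dynamics is expected), treats some qubits as hidden units that are marginalized out, and samples bitstrings from the remaining visible qubits in the spirit of a Born machine. This is a minimal NumPy sketch under assumed parameters; the Hamiltonian choices, the visible/hidden split, and names such as `sample_visible` are illustrative assumptions, not the paper's actual model or training procedure.

```python
# Minimal illustrative sketch (not the authors' implementation):
# sample bitstrings from the visible qubits of a state prepared by
# disordered (MBL-like) spin-chain dynamics, with hidden qubits traced out.
import numpy as np

def embed(op, site, n):
    """Place a single-qubit operator at `site` in an n-qubit chain."""
    mats = [np.eye(2, dtype=complex)] * n
    mats[site] = op
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

def disordered_hamiltonian(n, J=1.0, W=5.0, seed=None):
    """Nearest-neighbour XX + ZZ couplings with strong random Z fields
    (illustrative parameters, chosen only to mimic an MBL-like regime)."""
    rng = np.random.default_rng(seed)
    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sz = np.array([[1, 0], [0, -1]], dtype=complex)
    H = np.zeros((2**n, 2**n), dtype=complex)
    for i in range(n - 1):
        H += J * embed(sx, i, n) @ embed(sx, i + 1, n)
        H += J * embed(sz, i, n) @ embed(sz, i + 1, n)
    for i in range(n):
        H += rng.uniform(-W, W) * embed(sz, i, n)
    return H

def sample_visible(n_visible, n_hidden, t=3.0, shots=1000, seed=0):
    """Evolve |0...0> under the disordered Hamiltonian, marginalize over
    the hidden qubits, and sample bitstrings on the visible qubits."""
    n = n_visible + n_hidden
    H = disordered_hamiltonian(n, seed=seed)
    evals, evecs = np.linalg.eigh(H)
    psi0 = np.zeros(2**n, dtype=complex)
    psi0[0] = 1.0
    psi_t = evecs @ (np.exp(-1j * evals * t) * (evecs.conj().T @ psi0))
    probs = np.abs(psi_t) ** 2
    # Hidden units are taken as the last n_hidden qubits and summed out.
    probs = probs.reshape((2**n_visible, 2**n_hidden)).sum(axis=1)
    probs /= probs.sum()
    rng = np.random.default_rng(seed)
    return rng.choice(2**n_visible, size=shots, p=probs)

samples = sample_visible(n_visible=3, n_hidden=2)
print(np.bincount(samples, minlength=8) / len(samples))
```

In the model the abstract describes, the circuit or Hamiltonian parameters would presumably be the trainable quantities optimized against a target distribution; this sketch only shows the sampling side of such a generative model.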