We propose an approach for learning probability distributions as differentiable quantum circuits (DQCs) that enables efficient quantum generative modeling (QGM) and synthetic data generation. In contrast to existing QGM approaches, we train a DQC-based model in which data is encoded in a latent space with the proposed phase feature map of exponential capacity, followed by a trainable quantum circuit that forms the model. We then map the trained model to the bit basis using a fixed unitary transformation, here corresponding to a quantum Fourier transform circuit. This allows fast sampling from parametrized distributions using a single-shot readout. Importantly, latent-space training yields models that are automatically differentiable, and we show how samples from solutions of stochastic differential equations (SDEs) can be accessed by solving stationary and time-dependent Fokker-Planck equations with a quantum protocol. Our approach opens a route to multidimensional generative modeling with qubit registers explicitly correlated via a (fixed) entangling layer. In this case quantum computers can offer an advantage as efficient samplers, performing complex inverse transform sampling enabled by the fundamental laws of quantum mechanics. On the technical side the advances are multiple: we introduce the phase feature map, analyze its properties, and develop frequency-taming techniques that include qubitwise training and feature-map sparsification.

Published by the American Physical Society, 2024
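The core pipeline sketched in the abstract (phase feature map in a latent space, then a fixed quantum Fourier transform mapping to the bit basis for single-shot sampling) can be illustrated with a minimal classical simulation. The code below is a toy numpy stand-in, not the authors' implementation: the function names are hypothetical, the trainable circuit is omitted, and the QFT is applied as a dense matrix rather than a circuit. It shows only the key amplitude structure: the phase map prepares amplitudes proportional to exp(ikx) over basis states k (frequencies 0 to 2^n - 1, hence exponential capacity), and the inverse QFT concentrates an exact Fourier mode onto a single bitstring.

```python
import numpy as np

def phase_feature_map(x, n_qubits):
    """Toy latent-space encoding: amplitudes exp(i*k*x)/sqrt(N) over
    computational basis states k = 0..N-1, i.e. 2**n_qubits frequencies
    from an n-qubit register (exponential capacity)."""
    N = 2 ** n_qubits
    return np.exp(1j * np.arange(N) * x) / np.sqrt(N)

def inverse_qft(state):
    """Classical stand-in for the fixed inverse-QFT circuit,
    applied here as a dense N x N matrix."""
    N = len(state)
    k = np.arange(N)
    F = np.exp(2j * np.pi * np.outer(k, k) / N) / np.sqrt(N)  # QFT matrix
    return F.conj().T @ state

def sample_bit_basis(state, shots, rng):
    """Single-shot readout: sample bitstrings from the Born probabilities."""
    probs = np.abs(state) ** 2
    return rng.choice(len(state), size=shots, p=probs / probs.sum())

n = 4
N = 2 ** n
k0 = 5
x = 2 * np.pi * k0 / N  # a frequency the n-qubit map represents exactly
psi = inverse_qft(phase_feature_map(x, n))
rng = np.random.default_rng(0)
samples = sample_bit_basis(psi, shots=100, rng=rng)
# Every shot returns bitstring k0 = 5: the encoded state is an exact
# Fourier mode, so the inverse QFT maps it to a single basis state.
```

In the full scheme a trainable circuit would sit between the feature map and the QFT, reshaping the latent-space amplitudes so that the bit-basis measurement statistics realize the target distribution.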