Abstract

Parameterized quantum circuits (PQCs) have been broadly used as a hybrid quantum-classical machine learning scheme to accomplish generative tasks. However, whether PQCs have better expressive power than classical generative neural networks, such as restricted or deep Boltzmann machines, remains an open issue. In this paper, we prove that PQCs with a simple structure already outperform any classical neural network for generative tasks, unless the polynomial hierarchy collapses. Our proof builds on known results from tensor networks and quantum circuits (in particular, instantaneous quantum polynomial circuits). In addition, PQCs equipped with ancillary qubits for post-selection have even stronger expressive power than those without post-selection. We employ such post-selected PQCs for Bayesian learning, since they make it possible to learn prior probabilities rather than assuming they are known. We expect this approach to find many more applications in semi-supervised learning, where prior distributions are normally assumed to be unknown. Lastly, we conduct several numerical experiments using the Rigetti Forest platform to demonstrate the performance of the proposed Bayesian quantum circuit.
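As a rough illustration of the generative scheme the abstract describes, the sketch below builds a toy PQC in NumPy: alternating layers of parameterized RY rotations and CZ entanglers acting on |0...0>, with bitstrings then sampled from the resulting Born distribution. The ansatz, angles, and plain-NumPy simulation are illustrative choices of ours, not the paper's circuit; the paper's experiments run on the Rigetti Forest QVM via pyQuil.

```python
# Toy PQC used as a generative model (illustrative sketch, not the paper's ansatz):
# alternating layers of parameterized RY rotations and CZ entanglers on |0...0>,
# followed by Born-rule sampling of bitstrings from the output distribution.
import numpy as np

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def apply_single(state, gate, qubit, n):
    # Embed a single-qubit gate into the full 2^n-dimensional space.
    ops = [gate if q == qubit else np.eye(2) for q in range(n)]
    full = ops[0]
    for op in ops[1:]:
        full = np.kron(full, op)
    return full @ state

def apply_cz(state, q0, q1, n):
    # CZ is diagonal: flip the sign of amplitudes where both qubits are 1.
    new = state.copy()
    for idx in range(2 ** n):
        bits = format(idx, f"0{n}b")
        if bits[q0] == "1" and bits[q1] == "1":
            new[idx] *= -1
    return new

def pqc_distribution(thetas, n, layers):
    state = np.zeros(2 ** n, dtype=complex)
    state[0] = 1.0
    k = 0
    for _ in range(layers):
        for q in range(n):
            state = apply_single(state, ry(thetas[k]), q, n)
            k += 1
        for q in range(n - 1):
            state = apply_cz(state, q, q + 1, n)
    return np.abs(state) ** 2  # Born probabilities over bitstrings

n, layers = 3, 2
rng = np.random.default_rng(0)
probs = pqc_distribution(rng.uniform(0, 2 * np.pi, n * layers), n, layers)
samples = rng.choice(2 ** n, size=5, p=probs)
print([format(s, f"0{n}b") for s in samples])
```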

Highlights

  • There is a ubiquitous belief called “quantum supremacy” that quantum computers will outperform classical computers [1]

  • We provide a rigorous proof that, given a number of trainable parameters that scales polynomially with the number of qubits or visible neurons N, the bond dimensions represented by multilayer PQCs (MPQCs), tensor network PQCs (TPQCs), the deep Boltzmann machine (DBM), and the long-range restricted Boltzmann machine (RBM) scale as O(exp(N)), while the bond dimensions represented by the short-range RBM scale at most polynomially with N

  • Characterized by the entanglement entropy, we prove that MPQCs, tensor network PQCs (TPQCs), long-range RBMs, and long-range DBMs can efficiently simulate matrix product states (MPSs) whose bond dimensions scale exponentially with the number of inputs N, and which cannot be efficiently simulated by short-range RBMs (see the sketch after this list)
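Both highlights hinge on the standard relation between entanglement entropy and MPS bond dimension: across any cut, the Schmidt rank lower bounds the bond dimension D needed there, and the entropy obeys S <= log2(D). The toy NumPy sketch below computes both quantities via an SVD for a GHZ state and a Haar-random state; these example states are our own illustrative choices, not circuits from the paper.

```python
# Entanglement entropy and Schmidt rank across a bipartition of an N-qubit state.
# The Schmidt rank lower bounds the MPS bond dimension D at that cut, and the
# entanglement entropy satisfies S <= log2(D).  Example states are illustrative only.
import numpy as np

def entanglement_entropy_and_rank(state, n_left, n_right):
    # Reshape the amplitude vector into a (left block) x (right block) matrix
    # and read off the Schmidt spectrum from its singular values.
    mat = state.reshape(2 ** n_left, 2 ** n_right)
    svals = np.linalg.svd(mat, compute_uv=False)
    p = svals ** 2
    p = p[p > 1e-12]
    entropy = -np.sum(p * np.log2(p))
    return entropy, len(p)

n = 6
# GHZ state (|00...0> + |11...1>)/sqrt(2): entropy 1 bit, Schmidt rank 2 at any cut.
ghz = np.zeros(2 ** n, dtype=complex)
ghz[0] = ghz[-1] = 1 / np.sqrt(2)

# Haar-random state: entropy near the maximal n/2 bits at the middle cut, so an
# exact MPS there needs bond dimension ~ 2^(n/2), i.e. exponential in n.
rng = np.random.default_rng(0)
rand = rng.normal(size=2 ** n) + 1j * rng.normal(size=2 ** n)
rand /= np.linalg.norm(rand)

for name, psi in [("GHZ", ghz), ("random", rand)]:
    s, d = entanglement_entropy_and_rank(psi, n // 2, n - n // 2)
    print(f"{name}: entropy = {s:.3f} bits, Schmidt rank = {d}, log2(D) = {np.log2(d):.3f}")
```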


Summary

INTRODUCTION

There is a ubiquitous belief called “quantum supremacy” that quantum computers will outperform classical computers [1]. PQCs are capable of solving many kinds of machine learning tasks, and classical machine learning methods have been extensively applied to physics research. To analyze their relationship, we first prove that MPQCs can be formulated in the language of tensor networks. We further prove that instantaneous quantum polytime (IQP) circuits [41] are a special subclass of MPQCs. The probability distribution generated by IQP circuits cannot be sampled efficiently and accurately by any classical neural network [3]. This indicates that, from the perspective of complexity theory, MPQCs have an expressive power stronger than that of classical neural networks and have the potential to become a practical application with quantum supremacy [42]. The BQC experiments are implemented in Python, leveraging the pyQuil library to access the numerical simulator known as the quantum virtual machine (QVM) [44].
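The hardness argument above rests on IQP circuits: Hadamards on every qubit, a diagonal layer of Z and ZZ phase gates, Hadamards again, then a computational-basis measurement. The sketch below simulates such a circuit with a NumPy statevector and arbitrary angles; the paper itself runs its circuits on the QVM through pyQuil, so this is only an illustrative stand-in under those assumptions.

```python
# Minimal IQP circuit: H on all qubits, a diagonal layer of Z and ZZ phase gates,
# H on all qubits again, then a computational-basis measurement distribution.
# NumPy statevector simulation with arbitrary angles, for illustration only.
import numpy as np
from itertools import combinations

def iqp_distribution(n, z_angles, zz_angles):
    # H^{(x)n} |0...0> is the uniform superposition.
    state = np.full(2 ** n, 1 / np.sqrt(2 ** n), dtype=complex)

    # Diagonal layer: phase exp(i * [sum_j t_j Z_j + sum_{j<k} t_jk Z_j Z_k]),
    # where Z has eigenvalue (1 - 2*bit) on each computational basis state.
    for idx in range(2 ** n):
        z = [1 - 2 * ((idx >> (n - 1 - q)) & 1) for q in range(n)]
        phase = sum(z_angles[j] * z[j] for j in range(n))
        phase += sum(zz_angles[(j, k)] * z[j] * z[k] for j, k in combinations(range(n), 2))
        state[idx] *= np.exp(1j * phase)

    # Final layer of Hadamards on all qubits.
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    full = H
    for _ in range(n - 1):
        full = np.kron(full, H)
    state = full @ state
    return np.abs(state) ** 2

n = 3
rng = np.random.default_rng(1)
z_angles = rng.uniform(0, 2 * np.pi, n)
zz_angles = {pair: rng.uniform(0, 2 * np.pi) for pair in combinations(range(n), 2)}
probs = iqp_distribution(n, z_angles, zz_angles)
print({format(i, f"0{n}b"): round(p, 3) for i, p in enumerate(probs)})
```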

Boltzmann machine
Tensor networks
Entanglement entropy
Quantum circuits
Parametrized quantum circuits
Multilayer parametrized quantum circuits
Tensor network parametrized quantum circuits
Main result
Proof sketch of Theorem 1
BAYESIAN QUANTUM CIRCUIT
Layouts and optimization of the BQC
Expressive power of the BQC and AD-MPQCs
Generating bar-and-stripe dataset
Learning prior distribution
CONCLUSION AND DISCUSSION
