Abstract

Variational Quantum Algorithms (VQAs) may be a path to quantum advantage on Noisy Intermediate-Scale Quantum (NISQ) computers. A natural question is whether noise on NISQ devices places fundamental limitations on VQA performance. We rigorously prove a serious limitation for noisy VQAs, in that the noise causes the training landscape to have a barren plateau (i.e., vanishing gradient). Specifically, for the local Pauli noise considered, we prove that the gradient vanishes exponentially in the number of qubits n if the depth of the ansatz grows linearly with n. These noise-induced barren plateaus (NIBPs) are conceptually different from noise-free barren plateaus, which are linked to random parameter initialization. Our result is formulated for a generic ansatz that includes as special cases the Quantum Alternating Operator Ansatz and the Unitary Coupled Cluster Ansatz, among others. For the former, our numerical heuristics demonstrate the NIBP phenomenon for a realistic hardware noise model.
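
To make the mechanism concrete, the scaling can be sketched schematically as follows (a simplified illustration; the prefactors and exact exponents in the full theorem differ). Each layer of local Pauli noise contracts the state toward the maximally mixed state by some factor q < 1, so the magnitude of any partial derivative of the cost C obeys a bound of the form

    \left| \frac{\partial C}{\partial \theta_\mu} \right| \le B \, q^{L}, \qquad 0 < q < 1,

and if the ansatz depth grows linearly with the number of qubits, L = \kappa n, this becomes

    \left| \frac{\partial C}{\partial \theta_\mu} \right| \le B \, q^{\kappa n} = B \, e^{-\kappa n \ln(1/q)},

i.e., exponentially small in n.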

Highlights

  • Variational Quantum Algorithms (VQAs) may be a path to quantum advantage on Noisy Intermediate-Scale Quantum (NISQ) computers

  • While numerical heuristics at small or intermediate problem sizes are the norm for VQAs, analytical scaling results remain rare in this field

  • Our generic ansatz includes as special cases the Quantum Alternating Operator Ansatz (QAOA), used for solving combinatorial optimization problems[13,14,15,16], and the Unitary Coupled Cluster (UCC) Ansatz, used in the Variational Quantum Eigensolver (VQE) to solve chemistry problems[50,51,52]; a minimal code sketch of the QAOA structure follows this list
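
As a concrete illustration of the alternating structure just mentioned, the following minimal statevector simulation implements a QAOA-style MaxCut ansatz in plain numpy. It is a sketch under stated assumptions: the ring graph, layer count, and random parameters are illustrative choices (not from the paper), and the simulation is noise-free.

    import numpy as np

    n = 4                                      # number of qubits / graph vertices
    edges = [(0, 1), (1, 2), (2, 3), (3, 0)]   # illustrative ring graph
    p = 2                                      # number of QAOA layers

    # Diagonal of the MaxCut cost operator: C(z) = sum_{(i,j)} (1 - z_i z_j)/2,
    # with spins z_k in {+1, -1} read off the bits of each basis state.
    bits = (np.arange(2**n)[:, None] >> np.arange(n)) & 1
    z = 1 - 2 * bits
    cost_diag = sum((1 - z[:, i] * z[:, j]) / 2 for i, j in edges)

    def apply_rx_all(state, beta):
        """Apply the mixer exp(-i beta X) to every qubit of the statevector."""
        c, s = np.cos(beta), -1j * np.sin(beta)
        rx = np.array([[c, s], [s, c]])
        psi = state.reshape([2] * n)
        for q in range(n):
            psi = np.moveaxis(np.tensordot(rx, psi, axes=([1], [q])), 0, q)
        return psi.reshape(-1)

    def qaoa_expectation(params):
        """Expected cut value <psi(gamma, beta)| C |psi(gamma, beta)>."""
        gammas, betas = params[:p], params[p:]
        psi = np.full(2**n, 2 ** (-n / 2), dtype=complex)   # |+>^n initial state
        for gamma, beta in zip(gammas, betas):
            psi = np.exp(-1j * gamma * cost_diag) * psi     # cost layer (diagonal)
            psi = apply_rx_all(psi, beta)                   # mixer layer
        return float(np.real(np.vdot(psi, cost_diag * psi)))

    rng = np.random.default_rng(0)
    print("expected cut:", qaoa_expectation(rng.uniform(0, np.pi, 2 * p)))

In a full QAOA workflow these 2p angles would be handed to a classical optimizer that maximizes the expected cut; here we simply evaluate the cost at a random parameter point.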


Introduction

Variational Quantum Algorithms (VQAs) may be a path to quantum advantage on Noisy Intermediate-Scale Quantum (NISQ) computers. For the local Pauli noise considered, we prove that the gradient vanishes exponentially in the number of qubits n if the depth of the ansatz grows linearly with n. These noise-induced barren plateaus (NIBPs) are conceptually different from noise-free barren plateaus, which are linked to random parameter initialization. It was previously proven that the gradient vanishes exponentially in n for randomly initialized, deep Hardware Efficient ansatzes[31,32] and dissipative quantum neural networks[33], and for shallow depth with global cost functions[34]. This vanishing-gradient phenomenon is referred to as a barren plateau in the training landscape. It is typical to consider the ansatz depth L scaling as poly(n) (e.g., in the UCC Ansatz[52]), for which our main result implies an exponential decay of the gradient in n. We refer to this as a Noise-Induced Barren Plateau (NIBP). Strategies to avoid noise-free barren plateaus[34,37,40,41,42,43] do not appear to solve the NIBP issue.
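
The contraction behind an NIBP can be seen heuristically in a toy density-matrix simulation. The sketch below is our own illustration under stated assumptions (a hardware-efficient layer of Ry rotations followed by a line of CZ gates, a single-qubit depolarizing channel after each layer as a simple stand-in for local Pauli noise, and a finite-difference estimate of one partial derivative of the cost C = <Z_0>), and it is not the realistic hardware noise model used in our numerics.

    import numpy as np

    I2 = np.eye(2, dtype=complex)
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
    Z = np.diag([1.0, -1.0]).astype(complex)

    def local_op(op, q, n):
        """Embed a single-qubit operator on qubit q of an n-qubit register."""
        out = np.array([[1.0 + 0j]])
        for k in range(n):
            out = np.kron(out, op if k == q else I2)
        return out

    def depolarize_all(rho, p, n):
        """Single-qubit depolarizing channel on every qubit (local Pauli noise stand-in)."""
        for q in range(n):
            mix = sum(local_op(P, q, n) @ rho @ local_op(P, q, n) for P in (X, Y, Z)) / 3
            rho = (1 - p) * rho + p * mix
        return rho

    def ry(theta):
        c, s = np.cos(theta / 2), np.sin(theta / 2)
        return np.array([[c, -s], [s, c]], dtype=complex)

    def cz(q1, q2, n):
        """Diagonal CZ gate between qubits q1 and q2."""
        d = np.ones(2**n, dtype=complex)
        for b in range(2**n):
            if (b >> q1) & 1 and (b >> q2) & 1:
                d[b] = -1
        return np.diag(d)

    def cost(params, n, L, p_noise):
        """C = <Z_0> after L noisy hardware-efficient layers."""
        rho = np.zeros((2**n, 2**n), dtype=complex)
        rho[0, 0] = 1.0                         # start in |0...0><0...0|
        for l in range(L):
            for q in range(n):
                U = local_op(ry(params[l, q]), q, n)
                rho = U @ rho @ U.conj().T
            for q in range(n - 1):
                G = cz(q, q + 1, n)
                rho = G @ rho @ G.conj().T
            rho = depolarize_all(rho, p_noise, n)
        return float(np.real(np.trace(local_op(Z, 0, n) @ rho)))

    rng = np.random.default_rng(1)
    n, p_noise, eps = 3, 0.1, 1e-4
    for L in range(1, 9):
        grads = []
        for _ in range(10):                     # average over random parameter draws
            params = rng.uniform(0, 2 * np.pi, size=(L, n))
            pp, pm = params.copy(), params.copy()
            pp[0, 0] += eps
            pm[0, 0] -= eps
            g = (cost(pp, n, L, p_noise) - cost(pm, n, L, p_noise)) / (2 * eps)
            grads.append(abs(g))
        print(f"L={L}:  mean |dC/dtheta| = {np.mean(grads):.3e}")

At fixed noise strength the printed gradient magnitude shrinks roughly geometrically with the depth L; taking L to grow linearly with n, as in the main result, turns this contraction into an exponential decay of the gradient in the number of qubits.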


