Abstract

We compare the BFGS optimizer, ADAM and Natural Gradient Descent (NatGrad) in the context of Variational Quantum Eigensolvers (VQEs). We systematically analyze their performance on the QAOA ansatz for the Transverse Field Ising Model (TFIM) as well as on overparametrized circuits with the ability to break the symmetry of the Hamiltonian. The BFGS algorithm is frequently unable to find a global minimum for systems beyond about 20 spins, and ADAM easily gets trapped in local minima. On the other hand, NatGrad shows stable performance on all considered system sizes, albeit at a significantly higher cost per epoch. In sharp contrast to most classical gradient-based learning, the performance of all optimizers is found to decrease upon seemingly benign overparametrization of the ansatz class, with BFGS and ADAM failing more often and more severely than NatGrad. Additional tests for the Heisenberg XXZ model corroborate the accuracy problems of BFGS in high dimensions, but they reveal some shortcomings of NatGrad as well. Our results suggest that great care needs to be taken in the choice of gradient-based optimizers and the parametrization for VQEs.
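
For orientation, the NatGrad update referred to throughout is, in its standard quantum-natural-gradient form (the paper's exact conventions and regularization may differ), a gradient step preconditioned by the pseudo-inverse of the Fubini-Study metric tensor g:

    \theta^{(t+1)} = \theta^{(t)} - \eta \, g^{+}(\theta^{(t)}) \, \nabla E(\theta^{(t)}),
    \qquad
    g_{ij}(\theta) = \mathrm{Re}\left[ \langle \partial_i \psi_\theta | \partial_j \psi_\theta \rangle
        - \langle \partial_i \psi_\theta | \psi_\theta \rangle \langle \psi_\theta | \partial_j \psi_\theta \rangle \right].

Estimating g(\theta) on top of the energy gradient in every step is what makes NatGrad substantially more expensive per epoch than BFGS or ADAM.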

Highlights

  • Variational quantum algorithms such as the variational quantum eigensolver (VQE) or the quantum approximate optimization algorithm (QAOA) [1] have received a lot of attention of late

  • We start our numerical investigation with the QAOA circuit for the TFIM on N qubits with critical transverse field t = 1 and analyze the accuracy, speed and stability of all three optimizers: BFGS, adaptive moment estimation (ADAM) and NatGrad (a minimal code sketch of this setup follows the list)

  • As we show in Appendix B, reducing the learning rate makes bigger system sizes accessible to ADAM, but rather drastically increases run times because of slower convergence
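
As a concrete illustration of the setup described in the second highlight, here is a minimal sketch, assuming PennyLane with its built-in AdamOptimizer and QNGOptimizer, of a QAOA-type VQE for the critical TFIM. It is not the authors' code; the qubit count, depth, initialization, learning rates and step counts are illustrative assumptions, and API details may need adjusting for other PennyLane versions.

    import pennylane as qml
    from pennylane import numpy as np

    n_qubits, n_layers = 6, 3   # illustrative choices; the paper studies much larger systems
    dev = qml.device("default.qubit", wires=n_qubits)

    # TFIM with periodic boundaries at the critical field t = 1:
    # H = -sum_i Z_i Z_{i+1} - t * sum_i X_i
    coeffs = [-1.0] * (2 * n_qubits)
    ops = [qml.PauliZ(i) @ qml.PauliZ((i + 1) % n_qubits) for i in range(n_qubits)]
    ops += [qml.PauliX(i) for i in range(n_qubits)]
    H = qml.Hamiltonian(coeffs, ops)

    @qml.qnode(dev)
    def energy(params):
        # QAOA-type ansatz: alternating ZZ (cost) and X (mixer) layers,
        # with one shared (gamma, beta) pair per layer
        for i in range(n_qubits):
            qml.Hadamard(wires=i)
        for gamma, beta in params:
            for i in range(n_qubits):
                qml.IsingZZ(2 * gamma, wires=[i, (i + 1) % n_qubits])   # exp(-i gamma Z_i Z_{i+1})
            for i in range(n_qubits):
                qml.RX(2 * beta, wires=i)                               # exp(-i beta X_i)
        return qml.expval(H)

    def run(opt, steps=200, seed=0):
        # identical random starting point for every optimizer
        np.random.seed(seed)
        params = np.array(np.random.uniform(0.1, 0.5, (n_layers, 2)), requires_grad=True)
        for _ in range(steps):
            params = opt.step(energy, params)
        return energy(params)

    print("ADAM   :", run(qml.AdamOptimizer(stepsize=0.05)))
    print("NatGrad:", run(qml.QNGOptimizer(stepsize=0.05, approx="block-diag", lam=0.001)))

The QNG step additionally estimates a (block-diagonal) Fubini-Study metric tensor in every iteration, which is the extra per-epoch cost mentioned in the abstract; the small lam value is a Tikhonov regularization to keep the metric inversion well conditioned.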

Summary

Introduction

Variational quantum algorithms such as the variational quantum eigensolver (VQE) or the quantum approximate optimization algorithm (QAOA) [1] have received a lot of attention of late. In classical machine learning, the adaptive moment estimation (ADAM) optimizer [5] is among the most widely used and recommended algorithms [6,7] and has been one of the most important enablers of progress in deep learning in recent years. A comparably accurate and versatile optimizer for quantum variational algorithms has yet to be found.
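
For reference, the ADAM update of [5] keeps exponentially decaying averages of the gradient g_t and of its element-wise square, applies a bias correction, and rescales each coordinate of the step (standard hyperparameters \alpha, \beta_1, \beta_2, \epsilon):

    m_t = \beta_1 m_{t-1} + (1-\beta_1)\, g_t, \qquad
    v_t = \beta_2 v_{t-1} + (1-\beta_2)\, g_t^{\,2},

    \hat{m}_t = \frac{m_t}{1-\beta_1^{\,t}}, \qquad
    \hat{v}_t = \frac{v_t}{1-\beta_2^{\,t}}, \qquad
    \theta_t = \theta_{t-1} - \alpha\, \frac{\hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon}.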
