Abstract

Training quantum neural networks (QNNs) with gradient-based or gradient-free classical optimization approaches is severely hampered by the presence of barren plateaus in the cost landscapes. In this paper, we devise a framework that leverages quantum optimization algorithms to find optimal parameters of QNNs for certain tasks. To cast the training of a QNN as a quantum optimization problem, the QNN parameters are quantized: rather than being classical values, they are stored in quantum registers separate from those on which the QNN performs its computation. We then coherently encode the QNN cost function onto the relative phases of a superposition state in the Hilbert space of the QNN parameters. The parameters are tuned through an iterative quantum optimization procedure with adaptively selected Hamiltonians. The quantum mechanism of this framework exploits hidden structure in the QNN optimization problem and is therefore expected to provide a beyond-Grover speedup, mitigating the barren plateau issue.
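The core idea of coherently encoding a cost function onto relative phases can be illustrated with a small numerical sketch. The snippet below is our own toy illustration, not the paper's construction: it discretizes a single hypothetical QNN parameter onto a register of qubits, prepares a uniform superposition over the grid values, and applies a QAOA-style diagonal phase oracle so that each basis state picks up a phase proportional to its cost. The cost function and the angle `gamma` are assumptions for the example.

```python
import numpy as np

# Toy sketch (an illustration, not the paper's exact construction):
# discretize one QNN parameter theta onto 2^n grid values held in a
# quantum register, then encode the cost C(theta) onto relative phases
# of a uniform superposition: |theta> -> exp(-i * gamma * C(theta)) |theta>.

n_qubits = 3                                   # size of the parameter register
dim = 2 ** n_qubits
grid = np.linspace(0, 2 * np.pi, dim, endpoint=False)

def cost(theta):
    """Hypothetical single-parameter QNN cost landscape."""
    return 1.0 - np.cos(theta)

# Uniform superposition over all discretized parameter values.
state = np.full(dim, 1 / np.sqrt(dim), dtype=complex)

# Cost-dependent phase oracle: a diagonal unitary whose phases carry C(theta).
gamma = 0.5
state = np.exp(-1j * gamma * cost(grid)) * state

# The encoding is coherent: measurement probabilities are unchanged,
# and the cost information lives entirely in the relative phases.
print(np.allclose(np.abs(state) ** 2, 1 / dim))
```

Because the oracle is diagonal and unit-modulus, the state stays normalized and only relative phases change; subsequent mixing operations (with adaptively chosen Hamiltonians, in the paper's framework) are what convert that phase information into amplitude differences over parameter values.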
