Abstract

Quantum Architecture Search (QAS) has shown significant promise in designing quantum circuits for Variational Quantum Algorithms (VQAs). However, existing QAS algorithms primarily explore circuit architectures within a discrete space, which is inherently inefficient. In this paper, we propose Gradient-based Optimization for Quantum Architecture Search (GQAS), which leverages a circuit encoder, decoder, and predictor. Initially, the encoder embeds circuit architectures into a continuous latent representation. Subsequently, a predictor takes this continuous latent representation as input and outputs an estimated performance for the given architecture. The latent representation is then optimized through gradient descent within the continuous latent space based on the predicted performance. Finally, the optimized latent representation is mapped back to a discrete architecture via the decoder. To enhance the quality of the latent representation, we pre-train the encoder on a substantial dataset of circuit architectures using Self-Supervised Learning (SSL). Our simulation results on the Variational Quantum Eigensolver (VQE) indicate that our method outperforms Differentiable Quantum Architecture Search (DQAS).
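To make the encoder-predictor-decoder search loop concrete, the following is a minimal PyTorch sketch of gradient-based optimization in a learned latent space. All module definitions, dimensions, and names (CircuitEncoder, CircuitDecoder, PerformancePredictor, LATENT_DIM, etc.) are illustrative assumptions, not the paper's actual implementation; in practice the networks would first be trained (e.g., the encoder via SSL, the predictor on evaluated circuits) before the latent search step shown here.

```python
# Hypothetical sketch: gradient-based architecture search in a continuous latent space.
# Shapes, architectures, and hyperparameters below are assumptions for illustration only.
import torch
import torch.nn as nn

LATENT_DIM = 32          # assumed size of the continuous latent representation
NUM_GATE_TYPES = 8       # assumed vocabulary of candidate gates
NUM_POSITIONS = 16       # assumed number of gate slots in a circuit


class CircuitEncoder(nn.Module):
    """Maps a one-hot encoded circuit architecture to a continuous latent vector."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(NUM_POSITIONS * NUM_GATE_TYPES, 128), nn.ReLU(),
            nn.Linear(128, LATENT_DIM),
        )

    def forward(self, arch_onehot):
        return self.net(arch_onehot)


class CircuitDecoder(nn.Module):
    """Maps a latent vector back to per-position logits over gate types."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LATENT_DIM, 128), nn.ReLU(),
            nn.Linear(128, NUM_POSITIONS * NUM_GATE_TYPES),
        )

    def forward(self, z):
        return self.net(z).view(-1, NUM_POSITIONS, NUM_GATE_TYPES)


class PerformancePredictor(nn.Module):
    """Predicts a scalar performance score (e.g., a VQE energy) from the latent vector."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LATENT_DIM, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, z):
        return self.net(z).squeeze(-1)


def search_in_latent_space(encoder, decoder, predictor, seed_arch, steps=100, lr=0.05):
    """Optimize the latent point by gradient descent on the predicted score,
    then decode it back to a discrete architecture (argmax over gate logits)."""
    z = encoder(seed_arch).detach().requires_grad_(True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        predicted_energy = predictor(z)   # lower is better for VQE-style objectives
        predicted_energy.sum().backward()
        opt.step()
    logits = decoder(z)
    return logits.argmax(dim=-1)          # discrete gate choice per position


if __name__ == "__main__":
    encoder, decoder, predictor = CircuitEncoder(), CircuitDecoder(), PerformancePredictor()
    # Random seed architecture: one gate index per position, one-hot encoded.
    seed = torch.eye(NUM_GATE_TYPES)[torch.randint(NUM_GATE_TYPES, (1, NUM_POSITIONS))].float()
    best_arch = search_in_latent_space(encoder, decoder, predictor, seed)
    print("decoded architecture (gate index per position):", best_arch)
```

The key design point this sketch illustrates is that only the latent vector `z` is updated during the search; the encoder, decoder, and predictor weights stay fixed, so the architecture search reduces to continuous optimization rather than a discrete enumeration.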
