Abstract

Quantum Architecture Search (QAS) has shown significant promise in designing quantum circuits for Variational Quantum Algorithms (VQAs). However, existing QAS algorithms primarily explore circuit architectures directly in a discrete space, which is inherently inefficient. In this paper, we propose Gradient-based Optimization for Quantum Architecture Search (GQAS), which leverages a circuit encoder, decoder, and predictor. First, the encoder embeds circuit architectures into a continuous latent representation. A predictor then takes this continuous latent representation as input and outputs an estimated performance for the given architecture. Based on the predicted performance, the latent representation is optimized through gradient descent within the continuous latent space. The optimized latent representation is finally mapped back to a discrete architecture via the decoder. To enhance the quality of the latent representation, we pre-train the encoder on a substantial dataset of circuit architectures using Self-Supervised Learning (SSL). Our simulation results on the Variational Quantum Eigensolver (VQE) indicate that our method outperforms Differentiable Quantum Architecture Search (DQAS).
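The core loop described above (encode, predict, descend in the latent space, decode) can be sketched as follows. This is a minimal toy illustration, not the paper's implementation: the trained encoder, predictor, and decoder are replaced by hypothetical stand-ins (a random initial embedding, a quadratic surrogate for the predicted VQE energy with a known analytic gradient, and no decoder step), purely to show how gradient descent on the latent vector drives the predicted performance down.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical latent point that the surrogate predictor favours; in the
# paper this role is played by a learned performance predictor.
z_star = rng.normal(size=8)

def predictor(z):
    # Toy surrogate for the predicted VQE energy: lower is better.
    return float(np.sum((z - z_star) ** 2))

def predictor_grad(z):
    # Analytic gradient of the quadratic surrogate (a real predictor
    # would supply gradients via automatic differentiation).
    return 2.0 * (z - z_star)

def optimize_latent(z0, lr=0.1, steps=200):
    # Gradient descent in the continuous latent space -- the step that
    # replaces discrete search over circuit architectures.
    z = z0.copy()
    for _ in range(steps):
        z -= lr * predictor_grad(z)
    return z

z0 = rng.normal(size=8)       # stand-in for the encoder's output
z_opt = optimize_latent(z0)
print(predictor(z0), predictor(z_opt))  # predicted energy drops sharply
```

In the full method, `z_opt` would then be passed to the decoder to recover a discrete circuit architecture whose measured performance can be evaluated.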
