Abstract

Strongly correlated quantum many-body states provide invaluable resources for state-of-the-art quantum information science, ranging from quantum simulation to quantum metrology. In this work, we propose a supervised machine learning algorithm for optimal control of quantum many-body atomic states in optical lattices, demonstrated through numerical simulations based on Gaussian process regression (GPR), a Bayesian method. We combine this method with the time evolving block decimation (TEBD) algorithm for the preparation of the Heisenberg antiferromagnetic state in a system of bosonic atoms confined in a one-dimensional optical lattice. The quantum many-body ground state of 80 atoms is efficiently optimized within a few hundred machine learning iterations, reaching a state fidelity above $96\%$. With a multistep learning strategy, we find the machine-learning-based optimal control method is scalable to large systems owing to its transferability. Its robustness against noise is demonstrated by considering imperfections that are typically present in optical lattice experiments. In the application of the GPR method to the preparation of the two-dimensional (2D) antiferromagnetic state, a state fidelity of $94\%$ is reached for a 2D array of $6\times 6$ spins through the time dependent variational principle (TDVP) algorithm, confirming the generalizability of our method. We further optimize the Hamiltonian ramping sequence crossing a quantum phase transition of the quantum $XXZ$ spin chain with the exact diagonalization method, from which a control protocol for generating the long-sought atomic Greenberger-Horne-Zeilinger state is obtained. We believe the proposed optimal control method for quantum Hamiltonian ground-state preparation would benefit present ultracold-atom experiments in the study of strongly correlated many-body physics.
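The optimization loop described above can be illustrated with a minimal sketch: a Gaussian process regressor is fit to previously evaluated ramp parameters and their resulting state fidelities, and an acquisition rule proposes the next parameters to try. Here the costly TEBD/TDVP fidelity evaluation is replaced by a hypothetical toy function, and the dimension, iteration counts, and kernel settings are illustrative assumptions, not values from the paper.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Hypothetical stand-in for the TEBD/TDVP simulation: maps a vector of
# ramp-control parameters to a scalar "fidelity" in [0, 1].
def fidelity(x):
    return float(np.exp(-np.sum((x - 0.6) ** 2)))

dim = 3                   # number of ramp-control parameters (illustrative)
n_init, n_iters = 5, 30   # random seed points + GPR-guided iterations

X = rng.uniform(0.0, 1.0, size=(n_init, dim))
y = np.array([fidelity(x) for x in X])

gpr = GaussianProcessRegressor(kernel=RBF(length_scale=0.2),
                               normalize_y=True)

for _ in range(n_iters):
    gpr.fit(X, y)
    # Upper-confidence-bound acquisition over random candidate points:
    # prefer parameters with high predicted fidelity or high uncertainty.
    cand = rng.uniform(0.0, 1.0, size=(256, dim))
    mu, sigma = gpr.predict(cand, return_std=True)
    x_next = cand[np.argmax(mu + 2.0 * sigma)]
    X = np.vstack([X, x_next])
    y = np.append(y, fidelity(x_next))

print(f"best fidelity found: {y.max():.3f}")
```

In the actual protocol, each call to `fidelity` would correspond to one full many-body simulation of the ramp sequence, which is why sample-efficient Bayesian optimization is attractive here.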
