Abstract

Quantum control can be implemented by varying the system Hamiltonian. According to the adiabatic theorem, if the system starts in the ground state, a sufficiently slowly varying Hamiltonian approximately keeps it in the instantaneous ground state throughout the evolution. In this paper we treat this process as an interpolation between the initial and final Hamiltonians. We use the mean value of a single operator to measure the distance between the final state and the ideal ground state, and take this measure as the error of the adiabatic approximation. We prove that, under certain conditions, this error can be estimated precisely for an arbitrarily given interpolating function, and the estimate can serve as a guideline for inducing adiabatic evolution. According to our calculation, in many cases the adiabatic approximation error is not proportional to the average speed of variation of the system Hamiltonian or to the inverse of the energy gaps. In particular, we apply this analysis to an example for which the applicability of the adiabatic theorem is questionable.
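As a minimal numerical sketch of the setup the abstract describes (not the paper's estimation method), one can interpolate linearly between two assumed two-level Hamiltonians, evolve the initial ground state under the time-dependent Schrödinger equation, and measure how far the final state is from the final ground state. The Hamiltonians, the linear interpolation, and the infidelity measure below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Pauli matrices used as the assumed initial and final Hamiltonians
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def ground_state(H):
    """Return the lowest-energy eigenvector of a Hermitian matrix H."""
    vals, vecs = np.linalg.eigh(H)
    return vecs[:, 0]

def adiabatic_error(T, steps=2000):
    """Evolve under H(s) = (1 - s) * sz + s * sx with s = t / T,
    starting in the ground state of H(0); return the infidelity
    1 - |<ground(H(1)) | psi(T)>|^2 with the final ground state."""
    dt = T / steps
    H0, H1 = sz, sx
    psi = ground_state(H0)
    for k in range(steps):
        s = (k + 0.5) / steps  # midpoint value of the interpolation parameter
        H = (1 - s) * H0 + s * H1
        # Exact short-time step psi <- exp(-i H dt) psi via eigendecomposition
        vals, vecs = np.linalg.eigh(H)
        psi = vecs @ (np.exp(-1j * vals * dt) * (vecs.conj().T @ psi))
    overlap = abs(np.vdot(ground_state(H1), psi)) ** 2
    return 1 - overlap

# A slower sweep (larger total time T) should give a smaller adiabatic error.
err_fast = adiabatic_error(T=1.0)
err_slow = adiabatic_error(T=50.0)
```

For this simple example the error shrinks as the sweep slows, consistent with the standard adiabatic intuition; the paper's point is that the error's dependence on sweep speed and energy gaps is, in many cases, more subtle than this naive scaling suggests.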
