Abstract

We present a quantum algorithm for the dynamical simulation of time-dependent Hamiltonians. Our method expands the interaction-picture Hamiltonian as a sum of generalized permutations, which leads to an integral-free Dyson series for the time-evolution operator. Under this representation, we perform a quantum simulation of the time-evolution operator by means of the linear-combination-of-unitaries (LCU) technique. We optimize the time steps of the evolution based on the Hamiltonian's dynamical characteristics, leading to a gate count with L1-norm-like scaling in the norm of the interaction Hamiltonian alone, rather than that of the total Hamiltonian. We demonstrate that the cost of the algorithm is independent of the Hamiltonian's frequencies, implying an advantage for systems with highly oscillatory components; for time-decaying systems, the cost does not scale asymptotically with the total evolution time. In addition, our algorithm retains the near-optimal log(1/ϵ)/loglog(1/ϵ) scaling with simulation error ϵ.

Received 21 April 2021; Accepted 25 August 2021
DOI: https://doi.org/10.1103/PRXQuantum.2.030342

Published by the American Physical Society under the terms of the Creative Commons Attribution 4.0 International license. Further distribution of this work must maintain attribution to the author(s) and the published article's title, journal citation, and DOI.

Physics Subject Headings (PhySH) — Research Areas: Quantum algorithms; Quantum computation; Quantum simulation; Quantum Information
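As a small illustration of the expansion step (a minimal numerical sketch, not the paper's implementation): Pauli strings are generalized permutation matrices, having exactly one nonzero entry in each row and column, and any Hermitian operator on n qubits can be expanded in the Pauli basis with coefficients tr(P H)/2^n. The example Hamiltonian and helper function below are hypothetical, chosen only to demonstrate the decomposition.

```python
import numpy as np
from itertools import product

# Pauli matrices are generalized permutations: exactly one
# nonzero entry in each row and each column.
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
paulis = {"I": I2, "X": X, "Y": Y, "Z": Z}

def pauli_expand(H):
    """Expand a Hermitian matrix on n qubits in the Pauli basis.

    Returns {label: coefficient} with H = sum_P c_P * P,
    where c_P = tr(P H) / 2**n.
    """
    n = int(np.log2(H.shape[0]))
    coeffs = {}
    for labels in product("IXYZ", repeat=n):
        P = np.array([[1.0 + 0j]])
        for l in labels:
            P = np.kron(P, paulis[l])
        c = np.trace(P @ H) / 2**n
        if abs(c) > 1e-12:
            coeffs["".join(labels)] = c
    return coeffs

# Hypothetical example: a two-qubit Heisenberg-type interaction term
H = np.kron(X, X) + np.kron(Y, Y) + 0.5 * np.kron(Z, Z)
coeffs = pauli_expand(H)

# Reconstruct H from the expansion and verify it matches
H_rec = sum(c * np.kron(paulis[l[0]], paulis[l[1]])
            for l, c in coeffs.items())
assert np.allclose(H_rec, H)
```

Each term in such an expansion is (proportional to) a unitary generalized permutation, which is what makes the LCU technique applicable.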

Highlights

  • The problem of simulating quantum systems, whether it is to study their dynamics or to infer their salient equilibrium properties, was the original motivation for quantum computers [1] and remains one of their major potential applications [2,3].

  • This is in stark contrast to existing algorithms, whose cost depends on ||dH(t)/dt||, which grows with oscillation rates. Another class of Hamiltonians for which our algorithm is preferred over others is those with exponential decays in time. For these systems, our algorithm asymptotically requires a finite number of steps that does not scale with the evolution time, leading in turn to an exponential saving compared with the linear scaling of existing approaches.

  • We have presented a quantum algorithm for simulating the evolution operator generated by a time-dependent Hamiltonian.


Summary

INTRODUCTION

The problem of simulating quantum systems, whether it is to study their dynamics, or to infer their salient equilibrium properties, was the original motivation for quantum computers [1] and remains one of their major potential applications [2,3]. While in the time-independent case the Schrödinger equation can be formally integrated, the time-evolution unitary operator for time-dependent systems is given in terms of a Dyson series [18]—a perturbative expansion, wherein each summand is a multidimensional integral over a time-ordered product of the (usually interaction-picture) Hamiltonian at different points in time. These time-ordered integrals pose multiple algorithmic and implementation challenges.
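For concreteness, the Dyson series referred to here has the standard form (writing H_I(t) for the interaction-picture Hamiltonian; this is the textbook expansion, not the paper's integral-free representation):

```latex
U(t) = \mathcal{T}\exp\!\left(-i\int_0^t H_I(s)\,ds\right)
     = \sum_{k=0}^{\infty} (-i)^k \int_0^t \! dt_k \int_0^{t_k} \! dt_{k-1} \cdots \int_0^{t_2} \! dt_1 \,
       H_I(t_k)\, H_I(t_{k-1}) \cdots H_I(t_1),
```

where the nested integration limits enforce the time ordering t ≥ t_k ≥ ⋯ ≥ t_1 ≥ 0. Each order-k summand is a k-dimensional time-ordered integral, which is precisely the object the permutation expansion renders integral-free.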

PERMUTATION EXPANSION METHOD FOR TIME-DEPENDENT HAMILTONIANS
TIME-DEPENDENT HAMILTONIAN SIMULATION ALGORITHM
An overview of the algorithm
The LCU routine
Time partitioning and the number of time steps
State preparation
Implementation of the controlled unitaries
Algorithm cost
The cost for the state preparation and the controlled unitaries
Overall cost of the algorithm
Example advantages of the algorithm
Hamiltonians with arbitrary time dependence
ALTERNATIVE SCHEME AND REDUCTION TO THE TIME-INDEPENDENT CASE
CONCLUSIONS
