Abstract

A scheme for calculating thermally averaged observables in quantum dissipative systems is presented. The method is based on a wavefunction of equal amplitude and random phase, composed of a complete set of states, which is then propagated in imaginary time β/2. The scheme is applied to a Surrogate Hamiltonian simulation of a molecule, coupled to a bath, subject to an ultrafast pulse. Compared to Boltzmann thermal averaging, the method scales more favorably as the number of bath modes increases. A self-averaging phenomenon is identified which reduces the number of random sets required to converge the thermal average.
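The core idea can be illustrated with a minimal sketch. The model below is purely illustrative and not from the paper: a small tight-binding chain stands in for the Hamiltonian, and the site-position operator for the observable. An equal-amplitude, random-phase state is propagated in imaginary time by β/2, and the thermal expectation value is estimated as a ratio of averages over random-phase realizations.

```python
import numpy as np

# Illustrative model (an assumption, not the paper's Surrogate Hamiltonian):
# a 16-site tight-binding chain with a position-like observable.
rng = np.random.default_rng(0)
dim = 16
hopping = 0.5
H = np.zeros((dim, dim))
for i in range(dim - 1):
    H[i, i + 1] = H[i + 1, i] = hopping
A = np.diag(np.arange(dim, dtype=float))  # site-position observable (assumption)
beta = 1.0

# Diagonalize once so exp(-beta*H/2) can be applied exactly in this toy example;
# a grid/propagator method would be used for a realistic Hamiltonian.
evals, evecs = np.linalg.eigh(H)

def random_phase_thermal_average(n_realizations=2000):
    """Estimate <A> = Tr(A e^{-beta H}) / Tr(e^{-beta H}) stochastically."""
    num = den = 0.0
    for _ in range(n_realizations):
        # Equal-amplitude, random-phase superposition over a complete basis:
        # E[psi0 psi0^dagger] = identity, so the average recovers the trace.
        psi0 = np.exp(2j * np.pi * rng.random(dim))
        # Imaginary-time propagation by beta/2: psi_beta = e^{-beta H/2} psi0.
        c = evecs.conj().T @ psi0
        psi_b = evecs @ (np.exp(-beta * evals / 2.0) * c)
        num += np.real(psi_b.conj() @ (A @ psi_b))
        den += np.real(psi_b.conj() @ psi_b)
    return num / den

est = random_phase_thermal_average()

# Exact canonical average for comparison (7.5 here, by reflection symmetry).
w = np.exp(-beta * evals)
exact = float(((evecs.conj().T @ A @ evecs).diagonal().real * w).sum() / w.sum())
```

Because each random-phase state samples every basis state simultaneously, the estimator draws on the full spectrum in a single realization; this is the kind of self-averaging the abstract refers to, which keeps the number of random sets needed for convergence small.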
