Abstract

In the quest to reach lower temperatures of ultracold gases in optical-lattice experiments, nonadiabaticities during lattice loading represent one of the limiting factors that prevent these experiments from reaching the same low temperatures as experiments without lattices. Simulating the loading of a bosonic quantum gas into a one-dimensional optical lattice, with and without a trap, we find that the redistribution of atomic density inside the global confining potential is by far the dominant source of heating. Based on these results, we propose adjusting the trapping potential during loading to minimize changes to the density distribution. Our simulations confirm that even a simple linear interpolation of the trapping potential during loading already significantly decreases the heating of the quantum gas, and we discuss how loading protocols that minimize density redistribution can be designed.
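As an illustration of the proposed protocol (a sketch in our own notation; the abstract specifies only that the interpolation is linear), the trapping potential can be ramped over the loading time τ as

    V_trap(t) = (1 − t/τ) V_i + (t/τ) V_f,   0 ≤ t ≤ τ,

where V_i and V_f denote the trapping potential before and after loading, with V_f chosen so that the atomic density distribution in the final lattice-plus-trap system remains as close as possible to the initial one.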
