Abstract

The self-trapping phenomenon of Bose-Einstein condensates (BECs) in optical lattices is studied extensively by numerically solving the Gross-Pitaevskii equation. Our numerical results not only reproduce the phenomenon observed in a recent experiment [Anker {\it et al.}, Phys. Rev. Lett. {\bf 94}, 020403 (2005)], but also show that the self-trapping breaks down at long evolution times; that is, the self-trapping in optical lattices is only temporary. Analysis of our numerical results shows that the self-trapping in optical lattices is related to the self-trapping of BECs in a double-well potential. A possible mechanism for the formation of steep edges in the evolving wave packet is explored in terms of the dynamics of the relative phases between neighboring wells.
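
The abstract does not state the numerical scheme used; a common choice for time-evolving the Gross-Pitaevskii equation in an optical lattice is the split-step Fourier method. The sketch below is a minimal, dimensionless 1D illustration only; the grid size, lattice depth, nonlinearity, trap strength, and initial packet width are assumed values for demonstration, not the paper's parameters.

```python
import numpy as np

# Minimal 1D split-step Fourier sketch for the Gross-Pitaevskii equation
# (dimensionless units). All parameters below are illustrative assumptions.
N, L = 1024, 80.0                          # grid points, box length
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = L / N
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)    # momentum grid
dt, steps = 1e-3, 20000                    # time step, number of steps

V0 = 0.5                                   # lattice depth (assumed)
g = 1.0                                    # effective mean-field nonlinearity (assumed)
V = V0 * np.sin(np.pi * x) ** 2            # periodic lattice potential

# Initial narrow wave packet loaded onto a few lattice sites
psi = np.exp(-x ** 2 / 8.0).astype(complex)
psi /= np.sqrt(np.sum(np.abs(psi) ** 2) * dx)

for _ in range(steps):
    # Half step with potential + mean-field interaction in position space
    psi *= np.exp(-0.5j * dt * (V + g * np.abs(psi) ** 2))
    # Full kinetic step in momentum space
    psi = np.fft.ifft(np.exp(-0.5j * dt * k ** 2) * np.fft.fft(psi))
    # Second half step in position space
    psi *= np.exp(-0.5j * dt * (V + g * np.abs(psi) ** 2))

# Tracking the rms width of |psi|^2 over time distinguishes a self-trapped
# packet (nearly constant width) from a spreading one.
width = np.sqrt(np.sum(x ** 2 * np.abs(psi) ** 2) * dx)
print(f"rms width after evolution: {width:.3f}")
```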
