Abstract
The self-trapping of Bose-Einstein condensates (BECs) in optical lattices is studied in detail by numerically solving the Gross-Pitaevskii equation. Our numerical results not only reproduce the phenomenon observed in a recent experiment [Anker {\it et al.}, Phys. Rev. Lett. {\bf 94}, 020403 (2005)], but also show that the self-trapping breaks down at long evolution times; that is, self-trapping in optical lattices is only transient. Analysis of our numerical results shows that self-trapping in optical lattices is closely related to the self-trapping of BECs in a double-well potential. A possible mechanism for the formation of steep edges in the evolving wave packet is explored in terms of the dynamics of the relative phases between neighboring wells.
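The abstract does not spell out the form of the equation that is solved; for reference, a minimal sketch of the mean-field Gross-Pitaevskii equation with an optical-lattice potential, in its conventional one-dimensional form, reads as follows. The lattice depth $V_0$, lattice wave number $k$, and interaction strength $g$ are standard notation assumed here, not parameters quoted from the paper.
% Conventional 1D Gross-Pitaevskii equation with an optical-lattice potential
% (a sketch under assumed notation; dimensionality and parameter values used
% in the paper are not specified in the abstract):
\begin{equation}
  i\hbar\,\frac{\partial \psi(x,t)}{\partial t}
  = \left[-\frac{\hbar^{2}}{2m}\frac{\partial^{2}}{\partial x^{2}}
          + V_{0}\sin^{2}(kx)
          + g\,|\psi(x,t)|^{2}\right]\psi(x,t),
\end{equation}
% where g parametrizes the repulsive atom-atom interaction. Self-trapping sets
% in when the nonlinear term g|\psi|^2 dominates over inter-well tunneling,
% in analogy with the two-mode double-well case mentioned in the abstract.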