Abstract
We consider a fully quantized model of spontaneous emission, scattering, and absorption, and study propagation of a single photon from an emitting atom to a detector atom both with and without an intervening scatterer. We find an exact quantum analog to the classical complex analytic signal of an electromagnetic wave scattered by a medium of charged oscillators. This quantum signal exhibits classical phase delays. We define a time of detection which, in the appropriate limits, exactly matches the predictions of a classically defined delay for light propagating through a medium of charged oscillators. The fully quantized model provides a simple, unambiguous, and causal interpretation of delays that seemingly imply speeds greater than c in the region of anomalous dispersion.
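As context for the "classically defined delay" the abstract refers to, here is a minimal sketch of the classical Lorentz charged-oscillator model of dispersion. All parameter values are illustrative placeholders, not taken from the paper; the point is only that the group delay through a slab can fall below the vacuum transit time (or go negative) in the anomalous-dispersion region near resonance.

```python
import numpy as np

# Illustrative Lorentz-oscillator parameters (not from the paper)
c = 3.0e8          # speed of light (m/s)
w0 = 1.0e15        # oscillator resonance frequency (rad/s)
gamma = 1.0e13     # damping rate (rad/s)
wp = 2.0e14        # plasma frequency (rad/s)
L = 1.0e-3         # slab thickness (m)

w = np.linspace(0.8 * w0, 1.2 * w0, 20001)
chi = wp**2 / (w0**2 - w**2 - 1j * gamma * w)   # classical susceptibility
n = np.sqrt(1.0 + chi)                          # complex refractive index

# Group delay through the slab: tau_g = d/dw [ Re(n) * w * L / c ]
phase = np.real(n) * w * L / c
tau_g = np.gradient(phase, w)

# Near w0, d(Re n)/dw < 0 (anomalous dispersion), so tau_g can dip below
# the vacuum transit time L/c -- the apparent "faster than c" delay that
# the fully quantized model interprets causally.
print("vacuum transit time L/c:", L / c)
print("minimum group delay:", tau_g.min())
```

In this toy model the minimum group delay near resonance is far below L/c, which is the classical face of the seemingly superluminal delays the abstract addresses.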
Journal of Optics B: Quantum and Semiclassical Optics