Abstract
The use of an acousto-optic modulator in an interferometer enables heterodyne detection and can alleviate technical issues such as laser intensity noise and photodetector flicker noise. However, it also introduces at least a 3 dB penalty to the shot-noise-limited phase sensitivity compared with that attainable through homodyne detection. In this paper, we show theoretically that this penalty can be lifted by implementing detection and demodulation schemes that exploit the cyclostationary nature of the shot noise observed at the outputs of the self-heterodyne interferometer. We also discuss how expected departures from ideality, namely imperfect interferometric visibility, non-zero stationary noise, and finite detection bandwidth, affect the performance of these schemes, and we propose a simplified procedure that relaxes the requirement for rigorous instrument characterization. While two independent detectors are required to reach the true (classical-light) shot-noise limit for phase, single-detector instruments can also benefit from the introduced ideas, achieving up to a 3 dB improvement in phase sensitivity.
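As context for the setting described above, the sketch below simulates a shot-noise-limited heterodyne beat note and applies a conventional uniform-weight demodulation to recover the interferometric phase. This is an illustration of the measurement scenario only, not the cyclostationary-weighted scheme proposed in the paper; all parameter values (beat frequency, sampling rate, photon flux, phase) are assumed for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed, not taken from the paper).
f_beat = 1e5    # heterodyne beat frequency [Hz]
fs = 1e7        # sampling rate [Hz]
T = 1e-2        # record length [s]
flux = 1e8      # mean detected photon rate [photons/s]
phi_true = 0.3  # interferometric phase to be estimated [rad]

t = np.arange(0, T, 1 / fs)
dt = 1 / fs

# Instantaneous intensity at one interferometer output (unit visibility).
# The beat note makes the shot-noise variance periodic in time, i.e. the
# photocurrent noise is cyclostationary rather than stationary.
rate = flux * (1 + np.cos(2 * np.pi * f_beat * t + phi_true))

# Poissonian (shot-noise-limited) photon counts in each sample interval.
counts = rng.poisson(rate * dt)

# Conventional demodulation: uniform weighting at the beat frequency.
I = np.sum(counts * np.cos(2 * np.pi * f_beat * t))
Q = np.sum(counts * np.sin(2 * np.pi * f_beat * t))
phi_est = np.arctan2(-Q, I)

print(f"true phase      : {phi_true:.4f} rad")
print(f"estimated phase : {phi_est:.4f} rad")
```

With these assumed numbers the uniform-weight estimate lands within roughly a milliradian of the true phase; the paper's point is that such uniform weighting ignores the time-varying shot-noise variance and therefore incurs the heterodyne sensitivity penalty that cyclostationary-aware demodulation can remove.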