Abstract
The phase uncertainty of an unseeded nonlinear interferometer, in which the output of one nonlinear crystal is sent to the input of a second crystal that analyzes it, is commonly said to lie below the shot-noise level but to depend strongly on detection and internal loss. Unbalancing the gains of the first (source) and second (analyzer) crystals leads to a configuration that is tolerant to detection loss. However, in terms of sensitivity, there is no advantage in choosing a stronger analyzer over a stronger source, and hence the comparison to a shot-noise level is not straightforward. Internal loss breaks this symmetry and makes it crucial whether the source or the analyzer dominates. In light of these results, claiming a Heisenberg scaling of the sensitivity is more subtle than in a balanced setup.
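As a point of reference for the shot-noise comparison discussed above, the following sketch evaluates the textbook lossless, balanced-gain result for an unseeded SU(1,1) interferometer, where the optimal phase uncertainty is Δφ = 1/sinh(2g) = 1/√(N(N+2)) with internal photon number N = 2 sinh²(g). This is the standard idealized benchmark, not the paper's unbalanced or lossy analysis; the function names are illustrative.

```python
import math

def photon_number(g):
    # Mean photon number inside an unseeded SU(1,1) interferometer
    # with parametric gain g: N = 2 sinh^2(g)
    return 2.0 * math.sinh(g) ** 2

def nonlinear_dphi(g):
    # Standard lossless, balanced-gain phase uncertainty:
    # dphi = 1 / sinh(2g) = 1 / sqrt(N (N + 2))  (Heisenberg scaling in N)
    return 1.0 / math.sinh(2.0 * g)

def shot_noise_dphi(g):
    # Shot-noise reference at the same photon number: 1 / sqrt(N)
    return 1.0 / math.sqrt(photon_number(g))

for g in (0.5, 1.0, 2.0):
    N = photon_number(g)
    print(f"g={g:.1f}  N={N:7.2f}  "
          f"dphi_NL={nonlinear_dphi(g):.4f}  dphi_SNL={shot_noise_dphi(g):.4f}")
```

For any gain g > 0 the nonlinear-interferometer uncertainty 1/√(N(N+2)) is below the shot-noise value 1/√N at equal photon number, which is the sub-shot-noise claim the abstract qualifies for the unbalanced, lossy case.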