Abstract

The problem of detecting a completely known coherent optical signal in thermal background radiation is considered. The problem is the quantum-mechanical analog of detecting a known signal in Gaussian noise. The quantum detection counterpart is formulated in terms of a pair of density operators, and a solution is shown to exist. A perturbation solution is obtained by making use of a reproducing kernel Hilbert space of entire functions. The solution is particularly applicable at optical frequencies, where the effect of thermal radiation is small, and it is shown to converge to known results at zero thermal radiation. Curves are generated showing the detectability limit at optical frequencies. Also considered is the problem of finding an operator that maximizes a signal-to-noise ratio, defined for quantum detection in analogy with the classical theory. For a coherent signal with random phase, the operator that maximizes the signal-to-noise ratio is identical to the one obtained by applying the Neyman-Pearson criterion, thereby establishing a complete analogy with classical detection theory. For a signal with known phase, however, the analogy breaks down in the limit of zero thermal radiation. In that case, it is shown that an operator maximizing the “classical” signal-to-noise ratio does not exist.
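
For orientation, the binary quantum detection problem the abstract describes can be sketched in a standard Helstrom-style formulation. The sketch below is illustrative only; the symbols (the thermal-background state rho_0, the signal-plus-noise state rho_1, the detection operator Pi, and the false-alarm level alpha) are assumptions, not notation taken from the paper.

% A minimal sketch of the binary quantum detection setup (a standard
% formulation; all symbols are illustrative assumptions, not the
% paper's notation).
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Under hypothesis $H_0$ the received field is in the thermal state
$\rho_0$; under $H_1$ it is in the state $\rho_1$ of the coherent
signal plus thermal noise. A detection operator $\Pi$ with
$0 \le \Pi \le I$ gives
\begin{align}
  Q_0 &= \operatorname{Tr}(\rho_0 \Pi) && \text{(false-alarm probability)},\\
  Q_d &= \operatorname{Tr}(\rho_1 \Pi) && \text{(detection probability)}.
\end{align}
The Neyman-Pearson problem is to choose $\Pi$ maximizing $Q_d$
subject to $Q_0 \le \alpha$. A ``classical'' signal-to-noise ratio
for an observable $\Pi$, defined by analogy with the deflection
criterion, is
\begin{equation}
  \mathrm{SNR}(\Pi) =
    \frac{\left[\operatorname{Tr}(\rho_1\Pi)-\operatorname{Tr}(\rho_0\Pi)\right]^2}
         {\operatorname{Tr}(\rho_0\Pi^2)-\left[\operatorname{Tr}(\rho_0\Pi)\right]^2}.
\end{equation}
\end{document}

In the zero-thermal-radiation limit, $\rho_0$ reduces to the pure vacuum state; this is the regime in which, per the abstract, no operator maximizing the SNR exists for a signal with known phase.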
