Abstract

The sensitivity of an ideal heterodyne spectrometer approaches the quantum detection limit provided the local-oscillator power is large enough that shot noise dominates all other noise sources. The post-integration minimum detectable photon rate (photons/s) for an ideal heterodyne system is $(B/\tau)^{1/2}$, where $B$ is the IF bandwidth and $\tau$ is the integration time. For astronomical observations, however, a number of factors $\Delta_i$ degrade this sensitivity, an effect that becomes particularly significant when the laser power is insufficient. The degradation in sensitivity is discussed and evaluated for a heterodyne spectrometer employing a HgCdTe photodiode mixer and tunable diode lasers. The minimum detectable source brightness is considered as a function of the mixer parameters, the transmission coefficient of the beam splitter, and the local-oscillator emission power. The degradation in the minimum detectable line-source brightness that results from the bandwidth being a fraction of the line width is evaluated and plotted as a function of wavelength and bandwidth for various temperature-to-mass ratios. It is shown that the minimum achievable degradation $\prod_i \Delta_i$ in the sensitivity of a practical astronomical heterodyne spectrometer is ~30. Estimates are given of the SNRs with which IR line emission from astronomical sources of interest may be detected.
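
As an illustrative aside (not part of the original abstract), the quoted sensitivity relation can be sketched numerically. The minimal Python example below evaluates the ideal minimum detectable photon rate $(B/\tau)^{1/2}$ and scales it by an overall degradation factor $\prod_i \Delta_i$, then converts the result to an equivalent power at a given wavelength. The degradation value of ~30 comes from the abstract; the specific wavelength, bandwidth, and integration time are assumed example inputs, not values taken from the paper.

```python
import math

def min_detectable_photon_rate(bandwidth_hz, integration_s, degradation=1.0):
    """Minimum detectable photon rate (photons/s) for a heterodyne
    spectrometer: the ideal quantum-limited value (B/tau)^(1/2),
    multiplied by an overall degradation factor (product of the Delta_i)."""
    return degradation * math.sqrt(bandwidth_hz / integration_s)

def min_detectable_power(wavelength_m, bandwidth_hz, integration_s, degradation=1.0):
    """Equivalent minimum detectable power, P_min = h * nu * N_min."""
    h = 6.626e-34            # Planck constant, J s
    c = 2.998e8              # speed of light, m/s
    nu = c / wavelength_m    # optical frequency, Hz
    return h * nu * min_detectable_photon_rate(
        bandwidth_hz, integration_s, degradation)

if __name__ == "__main__":
    # Assumed example inputs: 10.6 um wavelength, 25 MHz IF bandwidth,
    # 1 hour integration, overall degradation ~30 as quoted in the abstract.
    rate = min_detectable_photon_rate(25e6, 3600, degradation=30)
    power = min_detectable_power(10.6e-6, 25e6, 3600, degradation=30)
    print(f"N_min ~ {rate:.3g} photons/s, P_min ~ {power:.3g} W")
```

The structure mirrors the abstract's argument: the ideal limit depends only on $B$ and $\tau$, while all instrument- and observation-specific effects enter through the single multiplicative degradation factor.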

