Abstract

Many modern millimeter and submillimeter ("mm-wave") telescopes for astronomy are deploying more detectors by increasing the detector pixel density, and, with the rise of lithographed detector architectures and high-throughput readout techniques, it is becoming increasingly practical to overfill the focal plane. However, when the pixel pitch $p_{\mathrm{pix}}$ is small compared to the product of the wavelength $\lambda$ and the focal ratio $F$, or $p_{\mathrm{pix}} \lesssim 1.2 F \lambda$, the Bose term of the photon noise becomes correlated between neighboring detector pixels due to the Hanbury Brown and Twiss (HBT) effect. When this HBT effect is non-negligible, the array-averaged sensitivity scales with the detector count $N_{\mathrm{det}}$ less favorably than the uncorrelated limit of $N_{\mathrm{det}}^{-1/2}$. In this paper, we present a general prescription for calculating this HBT correlation based on a quantum optics formalism and extend it to polarization-sensitive detectors. We then estimate the impact of HBT correlations on the sensitivity of a model mm-wave telescope and discuss the implications for focal plane design.
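To make the pitch criterion concrete, the following is a minimal Python sketch that checks whether a given focal plane design falls into the HBT-correlated regime $p_{\mathrm{pix}} \lesssim 1.2 F \lambda$. The design numbers (focal ratio, observing frequency, pixel pitch) are illustrative assumptions for a hypothetical instrument, not values from the paper.

```python
# Minimal sketch (not from the paper) of the pixel-pitch criterion quoted
# above: HBT correlations between neighboring pixels become non-negligible
# when p_pix <~ 1.2 * F * lambda. All design numbers below are illustrative
# placeholders for a hypothetical mm-wave instrument.

C_LIGHT = 299_792_458.0  # speed of light [m/s]

def hbt_pitch_threshold(focal_ratio: float, freq_hz: float) -> float:
    """Pitch scale 1.2*F*lambda below which HBT correlations matter."""
    wavelength = C_LIGHT / freq_hz
    return 1.2 * focal_ratio * wavelength

# Hypothetical design point: F = 1.9 optics observing at 150 GHz.
focal_ratio = 1.9
freq_hz = 150e9    # observing frequency [Hz]
pitch = 2.75e-3    # detector pixel pitch [m], illustrative value

threshold = hbt_pitch_threshold(focal_ratio, freq_hz)
print(f"1.2*F*lambda = {threshold * 1e3:.2f} mm, pitch = {pitch * 1e3:.2f} mm")
if pitch < threshold:
    print("HBT regime: the Bose noise term correlates between pixels, so the")
    print("array-averaged sensitivity improves more slowly than N_det**-1/2.")
else:
    print("Uncorrelated regime: the N_det**-1/2 scaling applies.")
```

At 150 GHz ($\lambda \approx 2.0$ mm) with $F = 1.9$, the threshold is about 4.6 mm, so a 2.75 mm pitch sits well inside the correlated regime in this example.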
