Abstract
A theoretical analysis is made of a microwave radiometer which utilizes no predetection amplification. The minimum detectable temperature, defined as the temperature which results in an output signal-to-noise ratio of unity, is shown to be (π/BM)(T₀/2Kτ₀)^(1/2), where B is the predetection bandwidth, M is the crystal figure of merit, T₀ is the ambient temperature, K is Boltzmann's constant, and τ₀ is the postdetection time constant. The predicted minimum detectable temperature for the millimeter wavelength region is about 50°K for a one-second postdetection time constant. Some experimental measurements are presented which show good agreement with the theoretical results. Because of the absence of predetection amplification, the sensitivity of the crystal-video radiometer is relatively poor; however, it is quite adequate for those applications where large temperatures are expected, such as gaseous discharge research. The simplicity of the crystal-video radiometer and its adaptability to complete solid-state instrumentation make it quite attractive for radiometric measurements from space probes and satellites.
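The sensitivity expression above can be evaluated numerically. The sketch below is purely illustrative: the parameter values (bandwidth, figure of merit, ambient temperature) are hypothetical placeholders chosen only to demonstrate the scaling of the formula, not values taken from the paper's experimental setup.

```python
import math

# Boltzmann's constant in J/K (denoted K in the abstract's notation)
BOLTZMANN = 1.380649e-23


def min_detectable_temperature(bandwidth_hz, figure_of_merit,
                               ambient_temp_k, time_constant_s):
    """Minimum detectable temperature of a crystal-video radiometer,
    per the abstract's expression (pi/BM) * (T0 / (2*K*tau0))**0.5.

    All parameter values passed in are assumed to be in mutually
    consistent units; the figure of merit M in particular is treated
    here as a dimensionless placeholder.
    """
    return (math.pi / (bandwidth_hz * figure_of_merit)) * math.sqrt(
        ambient_temp_k / (2.0 * BOLTZMANN * time_constant_s)
    )


# Hypothetical example values (not from the paper):
# 10 GHz predetection bandwidth, M = 50, room temperature,
# one-second postdetection time constant.
dT = min_detectable_temperature(
    bandwidth_hz=1.0e10,
    figure_of_merit=50.0,
    ambient_temp_k=290.0,
    time_constant_s=1.0,
)
print(f"Minimum detectable temperature: {dT:.1f} K")
```

Note the scaling the formula predicts: sensitivity improves linearly with predetection bandwidth B and figure of merit M, but only as the square root of the postdetection time constant τ₀, so quadrupling the integration time halves the minimum detectable temperature.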