Abstract

Midwave infrared systems with cooled detectors are widely used for high-precision quantitative measurement, such as radiometry and thermometry. As the basis of these applications, radiometric calibration establishes the relationship between the infrared images and the incident radiant flux generated by the scene or targets. Conventional radiometric calibration algorithms do not take the influences of integration time and ambient temperature into consideration; as a consequence, calibration accuracy deteriorates whenever the ambient temperature or the integration time varies. To solve this problem, we analyzed the effects of integration time and ambient temperature on the coefficients of the radiometric calibration formula, both theoretically and experimentally. A radiometric calibration method is then derived that eliminates the influence of integration-time and ambient-temperature variations on the accuracy of calibration and radiometry. Several radiometric calibration experiments were conducted using a midwave infrared camera inside a temperature-controlled chamber. The results indicate that the proposed calibration algorithm is more effective and accurate than conventional calibration methods under complicated working conditions with variable integration times and ambient temperatures.
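The conventional calibration the abstract refers to typically fits a linear relation between the digital output and the incident radiance at one fixed integration time and ambient temperature. A minimal sketch of that baseline follows; the gain `G`, offset `O`, integration time `t_int`, and the radiance values are hypothetical illustrative numbers, not figures from the paper, and the synthetic response model (gain proportional to integration time, constant offset) is an assumption for demonstration only.

```python
import numpy as np

# Hypothetical blackbody radiances (arbitrary units) at several source temperatures.
L = np.array([10.0, 20.0, 30.0, 40.0, 50.0])

# Synthetic camera response: DN = G * t_int * L + O.
# G, t_int, and O are made-up values for illustration only.
G = 120.0        # responsivity per unit integration time (assumed)
t_int = 2.0e-3   # integration time in seconds (assumed)
O = 800.0        # offset: dark level plus internal stray radiation (assumed)
DN = G * t_int * L + O

# Conventional calibration: fit DN = R*L + B by linear least squares.
# The fitted R and B are only valid at this t_int and ambient temperature,
# which is the limitation the proposed method addresses.
R, B = np.polyfit(L, DN, 1)
print(round(R, 3), round(B, 1))
```

Because `R` and `B` absorb the current integration time and ambient conditions, re-fitting is needed whenever either changes; the paper's method instead models those dependencies explicitly.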

