Abstract
Midwave infrared systems with cooled detectors are widely used for high-precision quantitative measurement, such as radiometry and thermometry. As the basis of these applications, radiometric calibration establishes the relationship between the infrared images and the incident radiant flux generated by the scene or targets. Conventional radiometric calibration algorithms do not take the influences of integration time and ambient temperature into consideration; as a consequence, calibration accuracy deteriorates whenever either quantity varies. To solve this problem, we analyze the effects of integration time and ambient temperature on the coefficients of the radiometric calibration formula, both theoretically and experimentally. We then derive a radiometric calibration method that eliminates the influence of integration-time and ambient-temperature variations on calibration and radiometric accuracy. Several radiometric calibration experiments were conducted with a midwave infrared camera inside a temperature-controlled chamber. The results indicate that, compared with conventional calibration methods, the proposed algorithm is more effective and accurate under complicated working conditions with variable integration times and ambient temperatures.
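To make the calibration idea concrete, the following is a minimal sketch (not the paper's actual algorithm, whose coefficient corrections are not given in this abstract) of the commonly assumed linear response model DN = G · t_int · L + O, where DN is the camera's digital output, L the incident radiance, t_int the integration time, G a gain coefficient, and O an offset that, per the abstract's analysis, also drifts with ambient temperature. The function names and the synthetic numbers are illustrative assumptions.

```python
# Hedged sketch of linear radiometric calibration: fit gain/offset from
# blackbody measurements at one integration time, then invert the model.
# Model assumed here: DN = G * t_int * L + O (not taken from the paper).
import numpy as np

def fit_calibration(radiances, dns, t_int):
    """Least-squares fit of gain G and offset O for a fixed integration time."""
    A = np.column_stack([t_int * radiances, np.ones_like(radiances)])
    (G, O), *_ = np.linalg.lstsq(A, dns, rcond=None)
    return G, O

def dn_to_radiance(dn, G, O, t_int):
    """Invert the response model: recover radiance from a digital number."""
    return (dn - O) / (G * t_int)

# Synthetic blackbody measurements (illustrative values only).
t_int = 2.0e-3                          # integration time, seconds
true_G, true_O = 5.0e5, 100.0           # hypothetical gain and offset
L = np.linspace(1.0, 10.0, 8)           # radiance levels, arbitrary units
dn = true_G * t_int * L + true_O        # noiseless simulated outputs

G, O = fit_calibration(L, dn, t_int)
L_est = dn_to_radiance(dn, G, O, t_int)
```

In this simplified model, a camera calibrated at one integration time would mispredict radiance at another; the abstract's point is that both t_int and ambient temperature must enter the calibration coefficients explicitly rather than being fixed at calibration-time values.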