Abstract

An investigation into the noise performance of optical lock-in thermography (OLT) is described. The study aims to clarify the influence of infrared detector type and key inspection parameters, such as illumination strength and lock-in duration, on the quality of OLT amplitude and phase imagery. It compares the performance of a state-of-the-art cooled photon detector with several lower-cost microbolometers. The results reveal a significant noise-performance advantage for the photon detector. Under certain inspection regimes the advantage in phase image quality is disproportionately high relative to the detector sensitivities. This is shown to result from an explicit dependence of the phase-signal variance on the ratio between the signal amplitude and the detector sensitivity. While this finding supports the preferred use of photon detectors for OLT inspections, it does not exclude microbolometers from a useful role. In cases where the significantly lower capital cost and improved practicality of microbolometers provide an advantage, it is shown that performance shortfalls can be overcome with a relatively small factorial increase in optical illumination intensity.
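The dependence of phase noise on the amplitude-to-sensitivity ratio can be illustrated with a simple simulation. The sketch below (an illustration constructed here, not taken from the paper; the lock-in frequency, sampling rate, and noise levels are arbitrary assumptions) performs standard lock-in demodulation of a sinusoidally modulated signal with additive detector noise and shows that halving the signal amplitude at fixed detector noise roughly doubles the standard deviation of the recovered phase.

```python
import numpy as np

rng = np.random.default_rng(0)

def lockin_phase_std(amplitude, noise_std, f_lock=1.0, fs=100.0,
                     duration=10.0, trials=500):
    """Std. dev. of the lock-in phase estimate over many noisy trials.

    amplitude : modulated signal amplitude (arbitrary units)
    noise_std : per-sample detector noise, playing the role of the NETD
    """
    t = np.arange(0.0, duration, 1.0 / fs)
    ref_sin = np.sin(2 * np.pi * f_lock * t)
    ref_cos = np.cos(2 * np.pi * f_lock * t)
    phases = []
    for _ in range(trials):
        signal = (amplitude * np.sin(2 * np.pi * f_lock * t + 0.3)
                  + rng.normal(0.0, noise_std, t.size))
        i_comp = 2.0 * np.mean(signal * ref_sin)   # in-phase component
        q_comp = 2.0 * np.mean(signal * ref_cos)   # quadrature component
        phases.append(np.arctan2(q_comp, i_comp))  # recovered phase
    return np.std(phases)

# Halving the amplitude (equivalently, doubling the detector noise)
# roughly doubles the phase noise:
s1 = lockin_phase_std(amplitude=1.0, noise_std=0.05)
s2 = lockin_phase_std(amplitude=0.5, noise_std=0.05)
print(s1, s2, s2 / s1)  # ratio close to 2
```

This scaling is consistent with the paper's observation: a less sensitive detector (larger effective noise) or a weaker thermal response (smaller amplitude) degrades the phase image disproportionately, which is why increasing illumination intensity can compensate for a microbolometer's higher NETD.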
