Modern remote temperature sensing in the form of infrared imaging has become a widely used and important technique, able to measure and characterize radiant heat that is otherwise invisible. As more long-wave infrared (LWIR) detectors come to market to meet a wide array of needs, there is a need to differentiate and appraise LWIR detectors against the specific requirements of thermal comfort research. While most detectors measure in the range of 8-14 μm, only 37.6% of the energy emitted by a blackbody at 300 K falls within this spectral range. Thus, inherent to the operation of nearly all infrared detectors and cameras is an assumption about the emission curves of the objects sensed. Many materials in the built environment deviate significantly from the blackbody assumption, and the resulting error is one that generic gray-body emissivity corrections cannot fix. It is akin to photographing a scene with only the red channel of a camera and applying exposure compensation in an attempt to recover a true monochrome rendition: information is simply missing, and the adjusted image will still be clearly wrong. In this paper, we evaluate, by simulation and experiment, the errors that the intrinsic spectral assumptions of LWIR detectors introduce into infrared thermography used to drive thermal comfort heat transfer calculations.
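The 37.6% in-band figure quoted above can be checked by integrating Planck's law over the 8-14 μm band and dividing by the Stefan-Boltzmann total. The sketch below is an independent verification, not code from the paper; the function names are illustrative.

```python
import numpy as np
from scipy.integrate import quad

# Physical constants (SI units)
H = 6.62607015e-34       # Planck constant, J*s
C = 2.99792458e8         # speed of light, m/s
KB = 1.380649e-23        # Boltzmann constant, J/K
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W/(m^2*K^4)

def planck_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance, W / (m^2 * sr * m)."""
    return (2.0 * H * C**2 / wavelength_m**5
            / np.expm1(H * C / (wavelength_m * KB * temp_k)))

def in_band_fraction(lo_um, hi_um, temp_k):
    """Fraction of total blackbody emission between two wavelengths (in um)."""
    band, _ = quad(planck_radiance, lo_um * 1e-6, hi_um * 1e-6, args=(temp_k,))
    total = SIGMA * temp_k**4 / np.pi  # total radiance via Stefan-Boltzmann
    return band / total

print(f"8-14 um fraction at 300 K: {in_band_fraction(8.0, 14.0, 300.0):.3f}")
# -> approximately 0.376, consistent with the 37.6% figure in the abstract
```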