Abstract

An analytical radiometric thermometry model is presented that quantitatively predicts and corrects for the effects of gaseous emission and absorption on spectral-band and ratio thermometer readings. Using numerical radiative transfer simulations, the model accounts for both the augmentation and the attenuation of radiation in the gas medium. Its temperature predictions and corrections agree well with a series of measured data and with the results of the current best correction algorithms, showing that the model is well suited to both direct and inverse problems in radiation thermometry. For a thermometer viewing a target through an absorptive gas, the measurement uncertainty, including the uncertainty component due to gaseous emission and absorption, depends strongly on the thermometer's operating wavelength. If the intervening gas is optically thin, the measured temperature errors of spectral-band and ratio thermometers vary linearly with gas concentration or optical path length. In non-optically-thin situations, the errors of spectral-band thermometers vary as a second-order exponential function, and the errors of ratio thermometers increase approximately as a biphasic Hill function, with increasing gas concentration or optical path length. Based on the theoretical calculations of the present model, three new correction algorithms are proposed to eliminate the influence of gaseous absorption and emission.
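
To make the direct problem concrete, the following is a minimal sketch, not the paper's actual model: a single homogeneous gray gas layer of transmittance tau between a blackbody source and a spectral-band thermometer, with gas emissivity 1 - tau by Kirchhoff's law. The source temperature, gas temperature, wavelength, and tau values are illustrative assumptions. The apparent temperature is obtained by inverting Planck's law at the measured radiance.

```python
import numpy as np
from scipy.optimize import brentq

C1 = 1.191e-16   # first radiation constant for spectral radiance, W*m^2/sr
C2 = 1.4388e-2   # second radiation constant, m*K

def planck(T, lam):
    """Spectral radiance of a blackbody at temperature T (K), wavelength lam (m)."""
    return C1 / lam**5 / (np.exp(C2 / (lam * T)) - 1.0)

def apparent_temperature(T_source, T_gas, lam, tau):
    """Apparent temperature seen through a gray gas layer of transmittance tau.

    Line-of-sight radiative transfer without scattering:
        L_meas = tau * B(T_source, lam) + (1 - tau) * B(T_gas, lam)
    The gas both attenuates the source radiance and adds its own emission.
    """
    L_meas = tau * planck(T_source, lam) + (1.0 - tau) * planck(T_gas, lam)
    # Invert Planck's law: solve B(T_app, lam) = L_meas for T_app.
    return brentq(lambda T: planck(T, lam) - L_meas, 100.0, 5000.0)

# Illustrative numbers only: 1000 K source, 600 K gas, 4.3 um band.
for tau in (0.99, 0.9, 0.5):
    T_app = apparent_temperature(1000.0, 600.0, 4.3e-6, tau)
    print(f"tau = {tau:.2f}  temperature error = {T_app - 1000.0:+.1f} K")
```

For an optically thin layer (tau close to 1, i.e., tau ≈ 1 - kappa*c*l for absorption coefficient kappa, concentration c, and path length l), the resulting error is approximately linear in c or l, consistent with the optically thin behavior described above; at larger optical depths the error grows nonlinearly.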
