Abstract

Platinum resistance thermometers (PRTs) are capable of providing reliable measurements at the millikelvin level, and are widely used in both industry and research. However, the intrinsic thermal noise associated with their resistance requires a measurement current, typically around a milliampere, to determine that resistance. Unfortunately, this same current also dissipates heat in the thermometer element, causing the well-known "self-heating" effect of typically a few millikelvins. Performing measurements in terms of the ratio of the resistance to that at the ice point provides some cancellation of this error near that temperature; if the thermal resistance between the sensor and its environment were constant, this cancellation would work over a much wider temperature range. However, there is little evidence on the effectiveness of this strategy in practice. This paper reports an extensive set of systematic measurements of the self-heating of six standard platinum resistance thermometers (SPRTs) and six industrial platinum resistance thermometers (IPRTs) of different designs, as a function of temperature over the range from $-190\,^{\circ}\mathrm{C}$ to $420\,^{\circ}\mathrm{C}$, in a range of intercomparison baths and blocks. The measurements show that PRT self-heating varies from being almost constant with temperature to being nearly proportional to temperature; the assumption of a roughly temperature-independent thermal resistance is thus not justified in general. The results allow estimation of appropriate uncertainty terms for SPRT and IPRT self-heating in the two scenarios of "working in $R$" and "working in $W$."
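To make the cancellation argument concrete, here is a first-order sketch in generic notation (the symbols $\Delta T_{\mathrm{sh}}$, $R_{\mathrm{th}}$, and $S$ are introduced here, not taken from the paper). Writing the self-heating rise as $\Delta T_{\mathrm{sh}}(T) = I^2 R(T)\,R_{\mathrm{th}}(T)$, with $R_{\mathrm{th}}$ the thermal resistance between the element and its surroundings, the error in a temperature deduced from the ratio $W(T) = R(T)/R(0\,^{\circ}\mathrm{C})$ is approximately

$$
\delta T \;\approx\; \Delta T_{\mathrm{sh}}(T) \;-\; W(T)\,\frac{S(0)}{S(T)}\,\Delta T_{\mathrm{sh}}(0),
\qquad S(T) \equiv \frac{\mathrm{d}R}{\mathrm{d}T}.
$$

At the ice point, $W = 1$ and the two terms cancel. If $R_{\mathrm{th}}$ were constant, then $\Delta T_{\mathrm{sh}}(T) = W(T)\,\Delta T_{\mathrm{sh}}(0)$, and the cancellation would hold wherever the sensitivity $S$ varies slowly, which is why a constant thermal resistance would extend the benefit over a wide temperature range.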

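In practice, self-heating is commonly quantified by measuring the resistance at two currents and extrapolating linearly in $I^2$ (i.e., in dissipated power) to zero current. The sketch below is a minimal illustration of that standard extrapolation, not code or data from the paper; the function name and numerical values are hypothetical.

```python
def zero_power_resistance(r1: float, i1: float, r2: float, i2: float) -> float:
    """Extrapolate a PRT resistance reading to zero measurement current.

    The self-heating rise is proportional to the dissipated power
    P = I^2 * R, so for small rises the measured resistance is
    approximately linear in I^2:  R_meas(I) = R(0) + k * I^2.
    Two readings at different currents fix the zero-power intercept R(0).
    """
    s1, s2 = i1 ** 2, i2 ** 2          # the "power" axis (R is nearly constant)
    slope = (r2 - r1) / (s2 - s1)      # dR_meas / d(I^2)
    return r1 - slope * s1             # intercept at I = 0

# Hypothetical example: a nominal 25.5-ohm SPRT read at 1 mA and sqrt(2) mA.
r0 = zero_power_resistance(25.5501, 1.0e-3, 25.5503, 2 ** 0.5 * 1.0e-3)
self_heating_ohm = 25.5501 - r0        # resistance error at 1 mA
# Convert to temperature using the sensitivity (~0.1 ohm/K for a 25.5-ohm SPRT):
self_heating_mK = self_heating_ohm / 0.1 * 1000.0   # about 2 mK here
```

Choosing $i_2 = \sqrt{2}\,i_1$ is convenient because it doubles the dissipated power, so the difference between the two readings directly equals the self-heating at $i_1$.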