Abstract

Thermal insulation test methods approach their lower limits as thermal resistance falls below 0.1 m²·K/W. This is the minimum value specified in ASTM C 518 (ASTM International, 2010b), while ASTM C 177 (ASTM International, 2010a) proposes about 0.06 m²·K/W. Nevertheless, these are the test methods, along with their ISO equivalents, required by Australasian building codes and applied to many products and materials with thermal resistance below 0.1 m²·K/W. Alternatives, such as ASTM E 1530 (ASTM International, 2011), cover much lower resistances but require carefully prepared small specimens and very high contact pressures, and are therefore largely unsuitable for both technical and compliance reasons. At these low resistances, the insulation test methods face large errors because of interface resistance between the specimen and the apparatus hot and cold plates. Staying with C 518, the problem can be avoided by direct measurement of the test specimen surface temperatures, but this is difficult, has its own accuracy issues, and is often impractical for commercial laboratories. That technique is generally used in conjunction with interface materials, such as flexible foam, placed between the specimen and the hot and cold plates to enhance contact and to provide an access path for the temperature sensors. The alternative prospect of using these interface materials to ensure good specimen contact has been studied, in conjunction with a simple two-step thermal resistance determination based on the difference between measurements made with and without the test specimen. This article presents results of a study using this difference approach for the measurement of 12 highly conducting materials, including sheets of aluminum, phenolic, HDPE, MgO, bonded rubber and cork granules, PMMA, and compressed wood fiber.
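The two-step difference approach described above can be sketched as follows. This is a minimal illustration under assumed notation, not the authors' procedure or data: a heat-flow-meter apparatus (ASTM C 518) reports an apparent resistance R = ΔT/q for whatever stack sits between the plates, and measuring once with the interface layers alone and once with the specimen inserted lets the shared interface and contact resistances cancel in the subtraction.

```python
def apparent_resistance(delta_t_k: float, heat_flux_w_m2: float) -> float:
    """Apparent thermal resistance (m^2.K/W) of everything between the
    plates, from plate temperature difference (K) and heat flux (W/m^2)."""
    return delta_t_k / heat_flux_w_m2


def specimen_resistance(dt_with: float, q_with: float,
                        dt_without: float, q_without: float) -> float:
    """Two-step difference determination: apparent resistance measured
    with the specimen in the stack, minus the apparent resistance of the
    interface layers alone. Interface/contact terms common to both
    measurements cancel."""
    return (apparent_resistance(dt_with, q_with)
            - apparent_resistance(dt_without, q_without))


# Illustrative numbers only (not from the study): 5 K across the stack at
# 50 W/m^2 with the specimen, and 5 K at 62.5 W/m^2 for the interface
# layers alone.
r_s = specimen_resistance(5.0, 50.0, 5.0, 62.5)
print(round(r_s, 3))  # 0.1 - 0.08 = 0.02 m^2.K/W
```

The subtraction is what makes the method attractive for resistances below 0.1 m²·K/W, where interface resistance would otherwise dominate a single direct measurement.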
