In designing thermoelectric and heat-transfer devices on the micrometer scale, accurate measurement of thermal conductivity is essential, and a variety of measurement methods have been developed. Among them, the 3ω method is particularly well suited to conductive wires because it measures thermal conductivity directly, without requiring the density or specific heat, and does so along the same direction as the electrical and thermoelectric properties. However, previous studies have not sufficiently considered the effects of ambient pressure and of the conductive adhesive used to attach the sample to the electrodes, both of which may hinder accurate measurement. In this study, using a thin gold wire as a test sample, the influence of ambient pressure and of the length of conductive adhesive along the sample was investigated quantitatively as major sources of systematic error in the 3ω method. When the pressure was increased within the transition-flow regime, the measured apparent thermal conductivity increased. An analytical model of low-pressure gas heat conduction is proposed that quantitatively explains this pressure dependence. The measured value also increased when the length of the conductive adhesive exceeded 20% of the sample length. This work shows that the ambient gas should be evacuated to the molecular-flow regime and that the length of the conductive adhesive should be kept below 20% of the sample length. The guidelines proposed here will help researchers in various fields determine the thermal conductivity of micrometer-scale wires more accurately.
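To illustrate the kind of pressure dependence the abstract describes, the sketch below uses a generic textbook kinetic-theory interpolation for gas conduction in the transition regime, not the authors' specific analytical model: the effective gas conductivity around a wire falls off once the molecular mean free path becomes comparable to the wire diameter, so parasitic heat loss to the gas (and hence the apparent wire conductivity) grows with pressure. All parameter values here (wire diameter, molecular diameter, interpolation constant β) are illustrative assumptions.

```python
import numpy as np

K_B = 1.380649e-23  # Boltzmann constant, J/K

def mean_free_path(p, T=300.0, d_mol=3.7e-10):
    """Hard-sphere mean free path of gas molecules.
    p: pressure [Pa]; d_mol: molecular diameter [m] (~N2, assumed)."""
    return K_B * T / (np.sqrt(2.0) * np.pi * d_mol**2 * p)

def effective_gas_conductivity(p, d_wire=25e-6, k_continuum=0.026, beta=2.0):
    """Knudsen-number interpolation between the continuum and
    free-molecular limits: k_eff = k_0 / (1 + beta * Kn), where
    Kn = lambda / d_wire and beta is an order-unity,
    accommodation-dependent constant (assumed value)."""
    kn = mean_free_path(p) / d_wire
    return k_continuum / (1.0 + beta * kn)

# Parasitic gas conduction rises with pressure through the transition
# regime, which would inflate the apparent thermal conductivity of the
# wire if the chamber is not evacuated to the molecular-flow regime.
for p in [0.1, 1.0, 10.0, 100.0, 1000.0, 101325.0]:
    print(f"p = {p:9.1f} Pa -> k_gas_eff = {effective_gas_conductivity(p):.2e} W/m/K")
```

Under these assumptions, k_eff is negligible at sub-pascal pressures and approaches the continuum value near atmospheric pressure, consistent with the abstract's recommendation to evacuate to the molecular-flow regime.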