Abstract

Rock surface temperatures are of fundamental importance for studies on weathering, rockfall and permafrost. Point temperature measurements may not reflect the small-scale variability of temperature as a function of micro-topography. To close this gap, infrared thermography (IRT) appears to be a simple and promising approach. However, there are several pitfalls in terms of interpretation, as radiation temperatures depend on the emissivity and reflectivity of the rock, which in turn are influenced by rock type, surface roughness, wetness, surrounding weather conditions, and the angle to the camera axis.

We performed laboratory and exemplary field experiments in order to estimate the magnitude of possible errors. We used rough and smooth (sawn) specimens of six different stone types in wet and dry conditions and took IRT images at different tilt angles between the camera axis and the rock surface. Furthermore, we applied the approach to a small rock outcrop (approx. 3 × 3 m) and to a rockwall (approx. 100 × 100 m).

The results of the laboratory measurements show that the temperature error increases with increasing tilt angle of the rock surface. Depending on the nature of the reflected surroundings, radiation temperatures can be warmer or cooler than sensor temperatures. In typical settings, the error is low (<0.5 K) up to a tilt of 40°, but it may increase to >1 K at tilt angles of 50° and more. Smooth and wet surfaces tend to be more prone to deviations.

The field examples confirm the results of the laboratory tests, but they show that spatial differences in temperature can still be detected, as the "true" differences are usually larger than the magnitude of error. We suggest reducing the error of the IRT image by correcting temperatures using a high-resolution surface model.
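
The abstract does not give the correction formula itself. As a minimal sketch of how such a surface-model-based screening could work, the following Python snippet computes per-pixel tilt angles between the camera axis and the surface normals of a high-resolution surface model, and masks pixels whose tilt exceeds a threshold (here 50°, the region where the laboratory tests report errors above 1 K). The function names, the threshold parameter, and the assumption that the surface model is already co-registered with the IRT image are illustrative choices, not taken from the paper.

```python
import numpy as np

def tilt_angles(normals, view_dir):
    """Angle (degrees) between each surface normal and the camera axis.

    normals  : (H, W, 3) array of unit surface normals derived from a
               high-resolution surface model, registered to the IRT image.
    view_dir : (3,) unit vector along the camera viewing axis.
    """
    # cos(tilt) = |n . v|; the clip guards against rounding outside [0, 1]
    cos_t = np.clip(np.abs(normals @ view_dir), 0.0, 1.0)
    return np.degrees(np.arccos(cos_t))

def flag_unreliable(irt_temps, normals, view_dir, max_tilt_deg=50.0):
    """Mask IRT pixels whose tilt angle makes the radiation
    temperature unreliable (assumed error > ~1 K beyond 50°)."""
    tilt = tilt_angles(normals, view_dir)
    out = irt_temps.astype(float).copy()
    out[tilt > max_tilt_deg] = np.nan  # discard strongly tilted pixels
    return out

# Hypothetical usage: camera looking along -z at a registered scene
# screened = flag_unreliable(irt_image, surface_normals,
#                            np.array([0.0, 0.0, -1.0]))
```

Masking, rather than correcting, is the conservative option here: an actual correction would additionally require an emissivity model as a function of tilt angle, rock type and wetness, which the abstract identifies as the influencing factors but does not specify.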
