Abstract

Heat has been used as a tracer to identify and quantify groundwater inflows into streams. Over the last decade, a few methods have used fiber‐optic distributed temperature sensing to facilitate assessment of such inflows into small streams. However, these methods focused mainly on the groundwater percentage and the thermal contrast between groundwater and surface water without considering the flow regime of the surface water. In this study, artificial water inflows into a controlled flume were examined using fiber‐optic distributed temperature sensing to quantify the thermal anomalies induced as a function of the flow regime (turbulent or laminar). Computer simulations were then performed to widen the range of the parameters tested and provide insight into the physical processes involved. Experiments conducted under the turbulent regime were in accordance with the results and uncertainties of previous studies. Under the laminar regime, however, the inflow‐induced thermal anomalies were always smaller than those under the turbulent regime for a given inflow percentage. Therefore, the actual inflow percentage may be underestimated when using a classic method under a laminar regime.
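The "classic method" referred to above is typically a two-end-member heat mixing model: under complete mixing, the downstream temperature is the flow-weighted average of the upstream surface-water and groundwater temperatures, and the inflow fraction follows from the observed thermal anomaly and the thermal contrast. The sketch below is a minimal illustration of that estimate, assuming well-mixed (turbulent) conditions; the variable names and the example values are hypothetical, as the abstract does not give the exact formulation used in the study.

```python
# Hedged sketch of a two-end-member heat mixing model, as commonly applied to
# fiber-optic DTS data; the study's exact formulation is not given in the
# abstract, so names and the well-mixed assumption are illustrative only.

def inflow_fraction(t_upstream: float, t_downstream: float, t_groundwater: float) -> float:
    """Estimate the groundwater inflow fraction of the downstream flow.

    Assumes complete (turbulent) mixing, so the downstream temperature is a
    flow-weighted average of surface water and groundwater:
        T_down = f * T_gw + (1 - f) * T_up
    Solving for f gives the expression returned below.
    """
    if t_groundwater == t_upstream:
        raise ValueError("No thermal contrast between groundwater and surface water")
    return (t_downstream - t_upstream) / (t_groundwater - t_upstream)


# Hypothetical example: a 0.5 degC anomaly with a 5 degC contrast implies ~10% inflow.
print(inflow_fraction(t_upstream=18.0, t_downstream=17.5, t_groundwater=13.0))  # 0.1
```

In the context of the study's finding, incomplete mixing under a laminar regime damps the observed thermal anomaly, so an estimator of this form would return a smaller fraction than the true inflow percentage, which is the underestimation the abstract highlights.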
