Abstract

Land surface models (LSMs) must accurately simulate observed energy and water fluxes during droughts in order to provide reliable estimates of future water resources. We evaluated eight LSMs (14 model versions) for simulating evapotranspiration (ET) during periods of evaporative drought (Edrought) across six flux tower sites. Using an empirically defined Edrought threshold (a decline in ET below the observed 15th percentile), we show that the LSMs simulated 58 Edrought days per year, on average, across the six sites, roughly three times the observed 20 days. The simulated Edrought magnitude was ∼8 times greater than observed, and the simulated intensity was twice the observed value. Our findings point to systematic biases across LSMs when simulating water and energy fluxes under water-stressed conditions. The overestimation of key Edrought characteristics undermines confidence in the models' capability to simulate realistic drought responses to climate change and has wider implications for phenomena sensitive to soil moisture, including heat waves.
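The Edrought detection described above (daily ET falling below the observed 15th percentile) can be sketched as follows. This is a minimal illustration using a synthetic ET series; the deficit-based magnitude and intensity measures shown here are plausible assumptions for exposition, not the paper's exact definitions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical daily ET series (mm/day) for one year at one site;
# a real analysis would use flux tower observations and LSM output.
et_obs = rng.gamma(shape=2.0, scale=1.5, size=365)

# Edrought threshold: the 15th percentile of observed daily ET.
threshold = np.percentile(et_obs, 15)

# Edrought days: days when ET falls below the threshold.
edrought_days = et_obs < threshold

# Illustrative metrics (assumed, not from the paper):
# magnitude = cumulative ET deficit below the threshold (mm);
# intensity = mean deficit per Edrought day (mm/day).
deficit = np.where(edrought_days, threshold - et_obs, 0.0)
magnitude = deficit.sum()
intensity = magnitude / edrought_days.sum()

print(edrought_days.sum(), round(magnitude, 2), round(intensity, 3))
```

Applying the same threshold (derived from observations) to a model's simulated ET series would yield the model's Edrought day count, magnitude, and intensity for comparison against the observed values.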
