Abstract

The utility of hydrologic land surface models (LSMs) can be enhanced by using information from observational platforms, but mismatches between the two are common. This study assesses the degree to which model agreement with observations is affected by two mechanisms in particular: 1) physical incongruities between the support volumes being characterized and 2) inadequate or inconsistent parameterizations of physical processes. The Noah and Noah-MP LSMs by default characterize surface soil moisture (SSM) in the top 10 cm of the soil column. This depth is notably different from the 5-cm (or less) sensing depth of L-band radiometers such as NASA’s Soil Moisture Active Passive (SMAP) satellite mission. These depth inconsistencies are examined by using thinner model layers in the Noah and Noah-MP LSMs and comparing the resultant simulations to in situ and SMAP soil moisture. In addition, a forward radiative transfer model (RTM) is used to facilitate direct comparisons of LSM-based and SMAP-based L-band brightness temperature (Tb) retrievals. Agreement between models and observations is quantified using Kolmogorov–Smirnov distance values, calculated from empirical cumulative distribution functions of SSM and Tb time series. Results show that agreement of SSM and Tb with observations depends primarily on systematic biases, and the sign of those biases depends on the particular subspace being analyzed (SSM or Tb). This study concludes that the role of increased soil layer discretization on simulated soil moisture and Tb is secondary to the influence of component parameterizations, the effects of which dominate systematic differences with observations.
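The Kolmogorov–Smirnov distance mentioned above is the maximum vertical separation between two empirical cumulative distribution functions. As a minimal sketch (not the study's actual code), the metric can be computed for a pair of soil moisture time series as follows; the sample values are hypothetical:

```python
import numpy as np

def ks_distance(a, b):
    """Kolmogorov-Smirnov distance: the maximum vertical gap between
    the empirical CDFs of two samples."""
    a = np.sort(np.asarray(a, dtype=float))
    b = np.sort(np.asarray(b, dtype=float))
    # Evaluate both empirical CDFs at every pooled sample point
    grid = np.concatenate([a, b])
    cdf_a = np.searchsorted(a, grid, side="right") / a.size
    cdf_b = np.searchsorted(b, grid, side="right") / b.size
    return float(np.max(np.abs(cdf_a - cdf_b)))

# Hypothetical modeled vs. observed surface soil moisture (m^3 m^-3)
modeled  = [0.12, 0.15, 0.18, 0.22, 0.25, 0.30]
observed = [0.10, 0.14, 0.17, 0.20, 0.24, 0.28]
print(ks_distance(modeled, observed))
```

A distance of 0 indicates identical empirical distributions; values near 1 indicate distributions with little overlap, so systematic biases between simulated and observed SSM or Tb inflate the metric.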
