Abstract

Regional-scale estimation of soil moisture from in situ field observations is impractical because of sampling-representativeness problems and cost; remotely sensed satellite data offer an alternative. Here, simulations of 19- and 37-GHz vertical- and horizontal-polarization brightness temperatures and estimates of soil moisture derived from Special Sensor Microwave/Imager (SSM/I) data are presented for 798 0.25° × 0.25° boxes in the southwestern plains region of the United States for the period 1 August 1987 to 31 July 1988. A coupled land-canopy-atmosphere model is used to simulate the brightness temperatures: the land surface hydrology is represented by a thin-layer hydrologic model, the canopy scattering by a radiative transfer model, and the atmospheric attenuation by an empirical model. The simulated brightness temperatures are compared with those observed by the SSM/I sensor aboard the Defense Meteorological Satellite Program satellite. The observed brightness temperatures are inverted through the canopy radiative transfer and atmospheric attenuation models to estimate soil moisture, and the discrepancies between these SSM/I-based estimates and the simulated soil moistures are discussed. Mean monthly soil moistures estimated from the 19-GHz SSM/I brightness temperatures are interpreted alongside the mean monthly leaf area index and accumulated rainfall, and are used in conjunction with the hydrologic model to estimate cumulative monthly evaporation. The results of the simulations hold promise for the use of microwave brightness temperatures in hydrologic modeling for soil moisture estimation.
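
The forward and inverse steps described above can be illustrated with a minimal sketch of a zeroth-order (tau-omega) canopy radiative transfer model over a smooth soil surface. This is not the paper's model: the linear dielectric fit, the parameter values, and the function names (soil_dielectric, tau_omega_tb, retrieve_sm) are illustrative assumptions only; a real implementation would use a published soil dielectric mixing model and add rough-surface and atmospheric-attenuation corrections.

```python
import numpy as np

def soil_dielectric(sm):
    """Very rough real dielectric constant of moist soil.
    Placeholder linear fit (an assumption); real work uses models
    such as Wang-Schmugge or Dobson."""
    return 3.0 + 25.0 * sm

def fresnel_reflectivity(eps, theta_rad):
    """Smooth-surface Fresnel power reflectivities (H, V) at incidence theta."""
    cos_t = np.cos(theta_rad)
    sin2_t = np.sin(theta_rad) ** 2
    root = np.sqrt(eps - sin2_t)
    r_h = ((cos_t - root) / (cos_t + root)) ** 2
    r_v = ((eps * cos_t - root) / (eps * cos_t + root)) ** 2
    return r_h, r_v

def tau_omega_tb(sm, t_soil, t_canopy, tau, omega, theta_deg=53.1):
    """Zeroth-order (tau-omega) brightness temperatures, H and V pol.

    sm        : volumetric soil moisture (m^3/m^3)
    t_soil    : effective soil temperature (K)
    t_canopy  : canopy temperature (K)
    tau       : canopy optical depth at nadir
    omega     : single-scattering albedo
    theta_deg : SSM/I incidence angle (about 53.1 deg)
    """
    theta = np.radians(theta_deg)
    gamma = np.exp(-tau / np.cos(theta))  # canopy transmissivity
    eps = soil_dielectric(sm)
    r_h, r_v = fresnel_reflectivity(eps, theta)
    tbs = []
    for r in (r_h, r_v):
        e_soil = 1.0 - r  # soil emissivity
        # soil emission attenuated by canopy, plus upward and
        # soil-reflected canopy emission
        tb = (t_soil * e_soil * gamma
              + t_canopy * (1.0 - omega) * (1.0 - gamma) * (1.0 + r * gamma))
        tbs.append(tb)
    return tuple(tbs)  # (TB_H, TB_V) in kelvin

def retrieve_sm(tb_obs_h, t_soil, t_canopy, tau, omega, lo=0.02, hi=0.45):
    """Invert the forward model for soil moisture by bisection on TB_H,
    exploiting the fact that TB decreases monotonically with moisture."""
    for _ in range(50):
        mid = 0.5 * (lo + hi)
        tb_h, _ = tau_omega_tb(mid, t_soil, t_canopy, tau, omega)
        if tb_h > tb_obs_h:  # modeled soil too dry: raise moisture
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Wetter soil raises the dielectric constant, lowers emissivity,
# and hence lowers the simulated brightness temperature.
print(tau_omega_tb(sm=0.10, t_soil=300.0, t_canopy=298.0, tau=0.2, omega=0.05))
print(tau_omega_tb(sm=0.30, t_soil=300.0, t_canopy=298.0, tau=0.2, omega=0.05))
print(retrieve_sm(tb_obs_h=235.0, t_soil=300.0, t_canopy=298.0,
                  tau=0.2, omega=0.05))
```

The bisection retrieval relies on brightness temperature decreasing monotonically with soil moisture, which holds for the smooth-soil Fresnel emissivity assumed here; denser canopies (larger tau) mask the soil signal and weaken this sensitivity.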
