Abstract

The baseline radiometer brightness temperature (Tb) downscaling algorithm for NASA's Soil Moisture Active Passive (SMAP) mission, scheduled for launch in January 2015, is tested using an airborne simulation of the SMAP data stream. The algorithm synergistically uses 3 km Synthetic Aperture Radar (SAR) backscatter (σ) to downscale the 36 km radiometer Tb to 9 km. While the algorithm has already been tested using experimental datasets from field campaigns in the USA, it must be tested across a comprehensive range of land surface conditions (i.e., in different hydro-climatic regions) before global application. Consequently, this study evaluates the algorithm using data collected during the Soil Moisture Active Passive Experiments (SMAPEx) in south-eastern Australia, which closely simulated the SMAP data stream for a single SMAP radiometer pixel over a 3-week interval, with repeat coverage every 2–3 days. The results show an average root-mean-square error (RMSE) in downscaled Tb of 3.1 K and 2.6 K for h- and v-polarization, respectively, when downscaling to 9 km resolution; this increases to 8.2 K and 6.6 K when the algorithm is applied at 1 km resolution. Downscaling over the relatively homogeneous grassland areas resulted in an RMSE approximately 2 K lower than over the heterogeneous cropping area. Overall, the downscaling error at 9 km resolution was around 2.4 K for five of the nine days, meeting the 2.4 K error target of the SMAP mission.
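For readers unfamiliar with how radar-based Tb downscaling of this kind operates, the sketch below illustrates one common linear form, in which fine-scale deviations of SAR backscatter from the coarse-pixel mean modulate the coarse radiometer Tb. This is a minimal sketch, not the paper's calibrated implementation: the function name, the sensitivity parameter beta, and all numerical values are illustrative assumptions.

```python
import numpy as np

def downscale_tb(tb_coarse, sigma_fine, sigma_coarse, beta):
    """Downscale a coarse radiometer Tb (K) to a finer grid using
    co-located SAR backscatter (dB), assuming a linear Tb-sigma
    relationship within the coarse pixel.

    tb_coarse    : coarse (e.g. 36 km) brightness temperature, scalar [K]
    sigma_fine   : backscatter aggregated to the target grid (e.g. 9 km) [dB]
    sigma_coarse : backscatter averaged over the coarse pixel, scalar [dB]
    beta         : sensitivity dTb/dsigma [K/dB]; in practice estimated
                   from a time series of coarse observations (assumption)
    """
    return tb_coarse + beta * (sigma_fine - sigma_coarse)

# Hypothetical example: one 36 km pixel containing a 4x4 grid of 9 km cells
rng = np.random.default_rng(0)
sigma_9km = -12.0 + rng.normal(0.0, 1.5, size=(4, 4))  # synthetic backscatter [dB]
tb_36km = 260.0                                         # synthetic coarse Tb [K]
beta = -2.0                                             # synthetic sensitivity [K/dB]

tb_9km = downscale_tb(tb_36km, sigma_9km, sigma_9km.mean(), beta)
print(tb_9km.round(1))
```

Because the correction term averages to zero over the coarse pixel, the downscaled field preserves the coarse-scale Tb by construction; the skill of such an algorithm lies in estimating the sensitivity parameter (and any additional terms in the full baseline formulation) from the observation time series.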
