A novel algorithm is developed to downscale soil moisture (SM) from satellite scales of 10–40 km to 1 km by exploiting its temporal correlations with historical auxiliary data at finer scales. Including such correlations drastically reduces the size of the training set needed, accounts for time-lagged relationships, and enables downscaling even in the presence of short gaps in the auxiliary data. The algorithm is based on bagged regression trees (BRT) and uses correlations between high-resolution remote sensing products and SM observations. It trains multiple regression trees and automatically selects the trees that generate the best downscaled estimates. The algorithm was evaluated using a multiscale synthetic data set for north central Florida over two years, with two growing seasons of corn and one growing season of cotton per year. The time-averaged error across the region was 0.01 m3/m3, with a standard deviation of 0.012 m3/m3, when 0.02% of the data were used for training together with temporal correlations from the past seven days and all available data from the past year. The maximum spatially averaged error in downscaled SM was 0.005 m3/m3, for pixels with cotton land cover. When land surface temperature (LST) on the day of downscaling was withheld from the algorithm to simulate "data gaps," the spatially averaged error increased only minimally, by 0.015 m3/m3. The results indicate that the BRT-based algorithm downscales SM with high accuracy by capturing complex nonlinear spatiotemporal correlations under heterogeneous micrometeorological conditions.
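To make the core step concrete, the sketch below shows how bagged regression trees can map coarse-scale SM plus lagged auxiliary predictors to fine-scale SM estimates. This is a minimal illustration, not the authors' implementation: it assumes scikit-learn (>= 1.2) with NumPy, uses synthetic stand-in data, and omits the paper's automatic tree-selection step; the feature set and sample sizes are illustrative only.

```python
# Minimal sketch of SM downscaling with bagged regression trees (BRT).
# Assumptions: scikit-learn >= 1.2, synthetic stand-in data; feature names
# (coarse SM, current LST, lagged LST) are illustrative, not the paper's inputs.
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)

# Each row is a 1 km pixel on the downscaling day.
# Columns: coarse-scale SM, current LST, NDVI, LST lagged 1..7 days (10 total).
n_pixels = 5000
X = rng.random((n_pixels, 10))
# Synthetic "true" fine-scale SM in m3/m3, driven mostly by coarse SM and LST.
y = 0.05 + 0.3 * X[:, 0] + 0.05 * X[:, 1] + 0.02 * rng.standard_normal(n_pixels)

# Only a small subset of pixels carries fine-scale SM for training,
# mirroring the sparse-training setting described in the abstract.
train_idx = rng.choice(n_pixels, size=200, replace=False)
test_idx = np.setdiff1d(np.arange(n_pixels), train_idx)

# Bagged regression trees: many trees fit on bootstrap resamples of the training set.
brt = BaggingRegressor(
    estimator=DecisionTreeRegressor(max_depth=8),
    n_estimators=50,
    random_state=0,
)
brt.fit(X[train_idx], y[train_idx])

# "Downscaled" SM at the remaining 1 km pixels and its error against held-out truth.
sm_fine = brt.predict(X[test_idx])
print("MAE (m3/m3):", mean_absolute_error(y[test_idx], sm_fine))
```

In this toy setup, dropping the current-day LST column before fitting loosely mimics the "data gap" experiment described above, with the lagged predictors left to carry the temporal correlation.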