ABSTRACT Sentinel-2 multispectral (S2-MS) images, equipped with three red-edge (Red-E) bands, are an optimal data source for vegetation monitoring. However, their 10–20 m spatial resolution greatly restricts their utility for local, precise monitoring. Widely used consumer-grade unmanned aerial vehicles (UAVs) provide images of much finer spatial resolution, but typically only in the visible and near-infrared spectral bands. UAV and S2-MS images are therefore strongly complementary in spatial, temporal, and spectral resolution. This paper establishes a spatio-temporal-spectral (STS) fusion framework for downscaling S2-MS images using UAV images. First, the spatio-temporal (ST) fusion method Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM) is applied to the spatio-spectral (SS) fusion of UAV and S2-MS images; it is shown to outperform existing SS fusion methods and to be robust across spatial scales. Then, CA-STARFM is constructed by coupling STARFM with Consistent Adjustment of the Climatology to Actual Observations (CACAO) and used to further optimize the SS fusion results, yielding better performance. Moreover, the applicability of CA-STARFM to STS fusion is verified on the UAV-like image generated by the ST fusion of UAV and S2-MS images. The results indicate that STARFM is competent for SS fusion at large spatial scales, while CA-STARFM not only optimizes the ST fusion of UAV and satellite images but also shows promise for SS fusion. Therefore, the proposed fusion framework provides a potential solution for integrating the spatial, temporal, and spectral information of UAV and S2-MS images for precise monitoring.
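The core idea behind STARFM-style fusion is that a fine-resolution base image (here, UAV) can be updated with the change observed between two coarse-resolution images (here, S2-MS), with each fine pixel predicted as a weighted sum over a moving window in which weights decrease with spectral difference, temporal difference, and spatial distance. The sketch below is an illustrative, heavily simplified rendition of that idea, not the paper's implementation; the function name `starfm_predict` and the single-band, pre-resampled inputs are assumptions, and the full algorithm's similar-pixel selection and uncertainty terms are omitted.

```python
import numpy as np

def starfm_predict(fine_t0, coarse_t0, coarse_tp, win=1):
    """Simplified STARFM-style prediction (illustrative sketch only).

    fine_t0   : fine-resolution band at the base date (e.g. UAV)
    coarse_t0 : coarse-resolution band at the base date (e.g. S2-MS),
                already resampled to the fine grid
    coarse_tp : coarse-resolution band at the prediction date
    win       : half-width of the moving window of candidate pixels

    Each fine pixel is a weighted sum, over the window, of
    fine_t0 + (coarse_tp - coarse_t0), with weights inversely
    proportional to the spectral difference |fine_t0 - coarse_t0|,
    the temporal difference |coarse_tp - coarse_t0|, and the
    spatial distance to the centre pixel.
    """
    rows, cols = fine_t0.shape
    out = np.empty((rows, cols), dtype=float)
    eps = 1e-6  # avoid division by zero for identical pixels
    for i in range(rows):
        for j in range(cols):
            i0, i1 = max(0, i - win), min(rows, i + win + 1)
            j0, j1 = max(0, j - win), min(cols, j + win + 1)
            f = fine_t0[i0:i1, j0:j1]
            c0 = coarse_t0[i0:i1, j0:j1]
            cp = coarse_tp[i0:i1, j0:j1]
            yy, xx = np.mgrid[i0:i1, j0:j1]
            dist = 1.0 + np.hypot(yy - i, xx - j) / win  # relative distance
            s = np.abs(f - c0) + eps                     # spectral difference
            t = np.abs(cp - c0) + eps                    # temporal difference
            w = 1.0 / (s * t * dist)
            w /= w.sum()                                 # normalise weights
            out[i, j] = np.sum(w * (f + cp - c0))
    return out
```

In the degenerate case of spatially uniform inputs, the prediction reduces to the fine base value plus the coarse-resolution change, which is the sanity check the algorithm must pass before the window weighting matters.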