Abstract

Many spatiotemporal image fusion methods in remote sensing have been developed to blend images with high spatial resolution and images with high temporal resolution, addressing the trade-off between spatial and temporal resolution in any single sensor. Yet none of these fusion methods considers how temporal changes that differ from pixel to pixel affect the quality of the fusion results; an improved fusion method needs to integrate these differing temporal changes into one framework. Adaptive-SFSDAF extends SFSDAF, the existing method that incorporates sub-pixel class fraction change information into Flexible Spatiotemporal DAta Fusion, by performing spectral unmixing selectively rather than for every pixel, which greatly improves the efficiency of the algorithm. Accordingly, the main contributions of the proposed adaptive-SFSDAF method are twofold. The first is to detect pixels whose temporal change between the origin and prediction dates is an outlier, as these pixels are the most difficult to estimate and strongly affect the performance of spatiotemporal fusion methods. The second is an adaptive unmixing strategy guided by the resulting mask map, which skips a large number of pixels whose unmixing contributes little to the result. The proposed method is compared with the state-of-the-art Flexible Spatiotemporal DAta Fusion (FSDAF), SFSDAF, FIT-FC, and Unmixing-Based Data Fusion (UBDF) methods, and the fusion accuracy is evaluated both quantitatively and visually. The experimental results show that adaptive-SFSDAF achieves an outstanding balance between computational efficiency and the accuracy of the fusion results.
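The mask-guided selection described above can be illustrated with a short sketch. This is not the authors' implementation: the outlier criterion (change magnitude beyond mean + k·std), the function names, and the placeholder unmix_fn/fallback_fn are assumptions used only to show how a change mask can restrict the expensive unmixing step to a small subset of pixels.

```python
import numpy as np

def change_mask(coarse_t1, coarse_t2, k=2.0):
    """Flag coarse pixels whose temporal change is unusually large."""
    # Per-pixel magnitude of spectral change between the two dates
    # (inputs are (bands, rows, cols) reflectance arrays).
    diff = np.linalg.norm(coarse_t2 - coarse_t1, axis=0)
    # Pixels beyond mean + k*std are treated as change outliers that
    # warrant the full (expensive) unmixing step; the criterion is a
    # simple stand-in for whatever test the paper actually uses.
    return diff > diff.mean() + k * diff.std()

def adaptive_predict(coarse_t1, coarse_t2, unmix_fn, fallback_fn, k=2.0):
    """Apply unmix_fn only where the mask flags significant change;
    use the cheaper fallback_fn everywhere else."""
    mask = change_mask(coarse_t1, coarse_t2, k)
    pred = fallback_fn(coarse_t1, coarse_t2)   # cheap prediction for all pixels
    rows, cols = np.where(mask)
    for r, c in zip(rows, cols):               # expensive path only where needed
        pred[:, r, c] = unmix_fn(coarse_t1[:, r, c], coarse_t2[:, r, c])
    return pred, mask

# Toy usage with placeholder (hypothetical) functions:
t1 = np.random.rand(4, 50, 50)
t2 = t1 + 0.02 * np.random.rand(4, 50, 50)
t2[:, 10:15, 10:15] += 0.5                     # simulate an abrupt change patch
pred, mask = adaptive_predict(
    t1, t2,
    unmix_fn=lambda p1, p2: p2,                # stand-in for the unmixing result
    fallback_fn=lambda c1, c2: c2.copy(),      # stand-in for a cheap prediction
)
print(mask.sum(), "of", mask.size, "pixels were unmixed")
```

In this toy run only the simulated change patch is flagged, so the costly per-pixel unmixing touches a small fraction of the image, which is the efficiency gain the abstract refers to.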

Highlights

  • Earth observation missions have played an important role in coping with global changes and solving many problems and challenges related to the development of human society

  • The Flexible Spatiotemporal DAta Fusion (FSDAF) [11] first estimated the temporal changes of endmembers in a scene based on the spatial unmixing method to describe gradual phenology changes, used spatial interpolation to characterize sudden land cover type changes, and performed residual compensation based on a weighted function of similar pixels

  • We introduce the theory and steps of adaptive-SFSDAF in Section 2 and describe the experiments and results in Sections 3 and 4

Summary

Introduction

Earth observation missions have played an important role in coping with global changes and in solving many problems and challenges related to the development of human society. The spatial and temporal adaptive reflectance fusion model (STARFM) is one of the earliest satellite image fusion models [1]; it is simple, flexible, and the most widely used. STARFM assumes that the land cover type within a coarse pixel does not change over the prediction period, so its performance degrades on highly heterogeneous landscapes. The Flexible Spatiotemporal DAta Fusion (FSDAF) method [11] first estimates the temporal changes of the endmembers in a scene by spatial unmixing to describe gradual phenological changes, then uses spatial interpolation to characterize abrupt land cover type changes, and finally performs residual compensation based on a weighting function over similar pixels. Building on FSDAF, SFSDAF [39] additionally incorporates sub-pixel class fraction change information, which yields better performance on landscapes with many mixed pixels and land cover type changes.
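FSDAF's first step, estimating the temporal change of each endmember from the coarse-pixel changes and the class fractions, amounts to a linear unmixing problem. The sketch below is a minimal least-squares version with hypothetical names; the published method applies this within local windows and imposes additional constraints that are omitted here.

```python
import numpy as np

def endmember_change(delta_coarse, fractions):
    """Estimate per-class (endmember) temporal change by linear unmixing.

    delta_coarse : (n_pixels, n_bands) change of each coarse pixel (T2 - T1).
    fractions    : (n_pixels, n_classes) class fractions within each coarse
                   pixel, derived from the classified fine image at T1.

    Solves, per band, delta_coarse[i, b] = sum_c fractions[i, c] * delta_E[c, b]
    in the least-squares sense and returns delta_E of shape (n_classes, n_bands).
    """
    delta_E, *_ = np.linalg.lstsq(fractions, delta_coarse, rcond=None)
    return delta_E

# Toy example: 100 coarse pixels, 4 bands, 3 classes.
rng = np.random.default_rng(0)
F = rng.dirichlet(np.ones(3), size=100)            # class fractions sum to 1
true_dE = rng.normal(size=(3, 4))                  # synthetic endmember changes
dC = F @ true_dE + 0.01 * rng.normal(size=(100, 4))
print(endmember_change(dC, F))                     # recovers values close to true_dE
```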

Methods
Fine Image Spectral Unmixing at T1
Spectral Unmixing to Fine Pixels at T2
Estimation of the Coarse-Resolution Endmember at T2
Estimation of the Fine-Resolution Endmember at T2
Estimation of the Coarse-Resolution Abundance at T2
Estimation of the Fine-Resolution Abundance at T2
Residual Compensation for the Temporal Prediction Image at T2
Study Area and Data
Comparison and Evaluation
Test Using the Gwydir Dataset with Land Cover Type Change
Comparison of Computation Times
Findings
Discussion

