Abstract
Due to technological limitations and budget constraints, spatiotemporal fusion is considered a promising way to deal with the tradeoff between the temporal and spatial resolutions of remote sensing images. Furthermore, the generative adversarial network (GAN) has shown its capability in a variety of image processing applications. This article presents a remote sensing image spatiotemporal fusion method using a GAN (STFGAN), which adopts a two-stage framework with an end-to-end image fusion GAN (IFGAN) for each stage. The IFGAN contains a generator and a discriminator in competition with each other under the guidance of the optimization function. Considering the large spatial resolution gap between the high-spatial, low-temporal (HSLT) resolution Landsat imagery and the corresponding low-spatial, high-temporal (LSHT) resolution MODIS imagery, a feature-level fusion strategy is adopted. Specifically, in the generator, we first super-resolve the MODIS images while extracting the high-frequency features of the Landsat images, and then integrate the features from the two sources. STFGAN learns an end-to-end mapping between the Landsat-MODIS image pairs and predicts the Landsat-like image for the prediction date by considering all the bands. STFGAN significantly improves the accuracy of phenological change and land-cover-type change prediction with the help of residual blocks and two prior Landsat-MODIS image pairs. To examine the performance of the proposed STFGAN method, experiments were conducted on three representative Landsat-MODIS data sets. The results clearly illustrate the effectiveness of the proposed method.
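To make the feature-level fusion idea concrete, the following is a minimal, hypothetical PyTorch sketch of a generator with two branches (one super-resolving the coarse MODIS input, one extracting features from a prior Landsat image) whose features are concatenated and refined by residual blocks. The class names, channel counts, and the 1:16 resolution ratio are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: a hypothetical IFGAN-style generator, not the paper's code.
import torch
import torch.nn as nn


class ResidualBlock(nn.Module):
    """Simple residual block used to refine the fused features."""
    def __init__(self, channels=64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )

    def forward(self, x):
        return x + self.body(x)  # identity skip connection


class FusionGenerator(nn.Module):
    """Hypothetical generator: super-resolve MODIS, extract Landsat features, then fuse."""
    def __init__(self, bands=6, channels=64, scale=16, n_res=4):
        super().__init__()
        # Branch 1: upsample the coarse MODIS input toward Landsat resolution.
        self.modis_branch = nn.Sequential(
            nn.Upsample(scale_factor=scale, mode="bilinear", align_corners=False),
            nn.Conv2d(bands, channels, 3, padding=1),
            nn.ReLU(inplace=True),
        )
        # Branch 2: extract high-frequency features from the prior Landsat image.
        self.landsat_branch = nn.Sequential(
            nn.Conv2d(bands, channels, 3, padding=1),
            nn.ReLU(inplace=True),
        )
        # Feature-level fusion followed by residual refinement.
        self.fusion = nn.Sequential(
            nn.Conv2d(2 * channels, channels, 1),
            *[ResidualBlock(channels) for _ in range(n_res)],
            nn.Conv2d(channels, bands, 3, padding=1),
        )

    def forward(self, modis_pred, landsat_prior):
        f_modis = self.modis_branch(modis_pred)
        f_landsat = self.landsat_branch(landsat_prior)
        return self.fusion(torch.cat([f_modis, f_landsat], dim=1))


# Toy usage: a MODIS-like patch at 1/16 the spatial resolution of the Landsat patch.
g = FusionGenerator()
modis = torch.randn(1, 6, 16, 16)       # coarse patch on the prediction date
landsat = torch.randn(1, 6, 256, 256)   # fine patch from a prior date
print(g(modis, landsat).shape)          # torch.Size([1, 6, 256, 256])
```

In a full two-stage setup as described above, such a generator would be trained adversarially against a discriminator, and the prediction would draw on two prior Landsat-MODIS pairs rather than the single prior image shown here.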