Abstract
Remote sensing satellites provide observations of the Earth’s surface, which are crucial data for applications and analyses in many fields, including agriculture, environmental protection, and sustainable development. However, the widespread and frequent occurrence of clouds severely undermines the quality and availability of usable optical data, particularly data with low temporal resolution. Although deep learning techniques have driven recent progress in cloud removal algorithms, thick cloud removal under changing land cover remains challenging. In this study, we propose a framework to remove thick clouds, thin clouds, and cloud shadow from Sentinel-2 images. The framework integrates the spatial detail in a Sentinel-2 reference image and the coarse spectral pattern in a near-target-date Sentinel-3 image as spatiotemporal guidance to reconstruct the missing data, including land cover change information, in a cloudy Sentinel-2 image. The reconstruction is performed by a spatiotemporal attention network (STAN) that adopts the self-attention mechanism, residual learning, and high-pass features to enhance feature extraction from the multisource data. The experimental results show that STAN outperforms the residual U-Net (ResUnet), the cloud-removal network (CRN), the spatial–temporal–spectral convolutional neural network (STS-CNN), and DSen2-CR in terms of multiple quantitative metrics and visual quality. A comparative experiment confirms that integrating Sentinel-3 data improves cloud removal performance, especially in areas with distinctive and heterogeneous land cover change under large-scale cloud cover. The results also indicate that STAN generalizes well when the Sentinel-3 image is acquired far from the target date, when its features are transferred to cloud removal for new images, and even with limited training data that simulates severe cloud cover.
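To make the described fusion concrete, the sketch below shows one plausible way to combine the three ingredients named in the abstract (self-attention, residual learning, and high-pass features) to fuse a cloudy Sentinel-2 image, a clear Sentinel-2 reference, and a coarse Sentinel-3 image. It is a minimal, hypothetical PyTorch illustration, not the authors' STAN implementation: the module names (ToySTAN, SelfAttention2d), band counts, channel widths, and the concatenation-based fusion are all assumptions made for the example.

```python
# Minimal, illustrative sketch (not the authors' released code): fusing a cloudy
# Sentinel-2 patch, a clear Sentinel-2 reference, and a coarse Sentinel-3 patch
# with self-attention, residual blocks, and a high-pass detail branch.
import torch
import torch.nn as nn
import torch.nn.functional as F


def high_pass(x: torch.Tensor) -> torch.Tensor:
    """Approximate high-pass features: the image minus a local (box-blurred) mean."""
    low = F.avg_pool2d(x, kernel_size=5, stride=1, padding=2)
    return x - low


class ResidualBlock(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )

    def forward(self, x):
        return x + self.body(x)  # residual learning


class SelfAttention2d(nn.Module):
    """Non-local-style self-attention over spatial positions."""

    def __init__(self, channels: int):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // 8, 1)
        self.key = nn.Conv2d(channels, channels // 8, 1)
        self.value = nn.Conv2d(channels, channels, 1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learnable attention weight

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)            # (b, hw, c//8)
        k = self.key(x).flatten(2)                               # (b, c//8, hw)
        v = self.value(x).flatten(2)                             # (b, c, hw)
        attn = torch.softmax(q @ k / (c // 8) ** 0.5, dim=-1)    # (b, hw, hw)
        out = (v @ attn.transpose(1, 2)).view(b, c, h, w)
        return x + self.gamma * out


class ToySTAN(nn.Module):
    """Hypothetical fusion network; band counts and widths are assumptions."""

    def __init__(self, s2_bands: int = 4, s3_bands: int = 4, feats: int = 64):
        super().__init__()
        # Inputs: cloudy S2, clear S2 reference, its high-pass detail, upsampled S3.
        in_ch = s2_bands * 3 + s3_bands
        self.head = nn.Conv2d(in_ch, feats, 3, padding=1)
        self.body = nn.Sequential(
            ResidualBlock(feats), SelfAttention2d(feats), ResidualBlock(feats)
        )
        self.tail = nn.Conv2d(feats, s2_bands, 3, padding=1)

    def forward(self, s2_cloudy, s2_ref, s3_coarse):
        # Upsample the coarse Sentinel-3 image to the Sentinel-2 grid.
        s3_up = F.interpolate(
            s3_coarse, size=s2_cloudy.shape[-2:], mode="bilinear", align_corners=False
        )
        x = torch.cat([s2_cloudy, s2_ref, high_pass(s2_ref), s3_up], dim=1)
        # Predict a residual correction on top of the cloudy input.
        return s2_cloudy + self.tail(self.body(self.head(x)))


if __name__ == "__main__":
    net = ToySTAN()
    s2_cloudy = torch.rand(1, 4, 64, 64)     # cloudy target-date Sentinel-2 patch
    s2_ref = torch.rand(1, 4, 64, 64)        # clear Sentinel-2 reference patch
    s3 = torch.rand(1, 4, 8, 8)              # coarse near-target-date Sentinel-3 patch
    print(net(s2_cloudy, s2_ref, s3).shape)  # torch.Size([1, 4, 64, 64])
```

The design choice illustrated here is the residual formulation: the network only predicts a correction to the cloudy input, so cloud-free regions pass through largely unchanged while the attention and high-pass branches supply guidance for the occluded areas.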