Abstract

In recent years, with the continuous advancement of remote sensing technology, remote sensing images have found widespread application in many fields. However, these images are often degraded by clouds, which significantly reduce their usability, so removing clouds has become an essential preprocessing step in remote sensing image analysis. While numerous deep learning models have produced impressive results in image denoising, relatively little research has applied deep neural networks to the problem of cloud removal in remote sensing images. This paper proposes a novel model, the Hybrid Attention Generative Adversarial Network (HyA-GAN), which integrates both channel and spatial attention mechanisms into a generative adversarial network. These combined attention mechanisms allow the network to prioritize the regions most important for reconstruction, improving its capacity to recover occluded content and generate remote sensing images with superior cloud removal quality. When compared with existing cloud removal models on the RICE dataset, HyA-GAN outperforms them in both peak signal-to-noise ratio (PSNR) and structural similarity index (SSIM). The experiments demonstrate the strong potential of HyA-GAN for removing clouds from remote sensing images.
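The abstract does not specify HyA-GAN's exact layer configuration, but the general idea of applying channel attention followed by spatial attention (as in CBAM-style blocks) can be illustrated with a minimal NumPy sketch. The pooling choices and gating functions below are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def hybrid_attention(x):
    """Reweight a feature map with channel attention, then spatial attention.

    x: feature map of shape (C, H, W).
    Returns an array of the same shape; both gates lie in (0, 1).
    """
    # Channel attention: squeeze spatial dims to one descriptor per channel,
    # then gate each channel. (A real block would pass this through a small MLP.)
    channel_desc = x.mean(axis=(1, 2))            # shape (C,), global average pool
    channel_gate = sigmoid(channel_desc)          # per-channel weight in (0, 1)
    x = x * channel_gate[:, None, None]

    # Spatial attention: squeeze the channel dim to one descriptor per location,
    # then gate each spatial position. (A real block would use a conv layer here.)
    spatial_desc = 0.5 * (x.mean(axis=0) + x.max(axis=0))  # shape (H, W)
    spatial_gate = sigmoid(spatial_desc)
    return x * spatial_gate[None, :, :]

rng = np.random.default_rng(0)
feat = rng.standard_normal((8, 16, 16))  # toy feature map: 8 channels, 16x16
out = hybrid_attention(feat)
print(out.shape)  # (8, 16, 16)
```

Because both gates are sigmoids, the block can only attenuate features, steering the generator toward the locations and channels it deems most informative; in the full model this reweighting is learned end-to-end inside the adversarial training loop.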
