Abstract

In agricultural remote sensing, the quality of optical image acquisition is often affected by weather, and the acquired satellite imagery usually contains clouds, leading to a loss of ground information. Unlike thick clouds, thin clouds are semi-transparent and do not completely obscure the ground scene. To remove thin clouds over cultivated land and restore the true ground information as faithfully as possible, we propose a cloud removal method based on a spatial information fusion self-attention generative adversarial network (SI-SA GAN) built on multi-directional perceptual attention and a self-attention mechanism. The proposed method identifies and focuses on cloud regions through spatial attention, channel attention, and self-attention, which enhances the recovered image information. The discriminator modules use residual networks and self-attention non-local neural networks to guide the output of image information. The generative adversarial network (GAN) removes clouds and restores the corresponding irregularly occluded areas according to the deep features of the input, and a gradient penalty is applied to improve the robustness of the generator. We compared the proposed method against other state-of-the-art models; the qualitative and quantitative results on a Sentinel-2A dataset and the public RICE dataset confirm that it effectively enhances image quality after cloud removal and achieves excellent thin-cloud removal performance with small-scale training data.
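The self-attention (non-local) operation the abstract refers to can be illustrated with a minimal sketch: each spatial position of a flattened feature map attends to every other position via scaled dot-product attention, so distant cloud-free pixels can inform the reconstruction of occluded ones. The shapes, projection matrices, and function names below are illustrative assumptions for exposition, not the paper's actual SI-SA GAN architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax along the given axis
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, Wq, Wk, Wv):
    """Scaled dot-product self-attention over spatial positions.

    x  : (N, C) feature map flattened to N spatial positions, C channels.
    Wq, Wk, Wv : (C, C) learned projection matrices (random here).
    Returns the attended features (N, C) and the (N, N) attention map.
    """
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    attn = softmax(q @ k.T / np.sqrt(k.shape[-1]))  # each row sums to 1
    return attn @ v, attn

# Toy usage: 4 spatial positions, 8 channels, random projections.
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
Wq, Wk, Wv = (rng.standard_normal((8, 8)) for _ in range(3))
out, attn = self_attention(x, Wq, Wk, Wv)
```

In the non-local-block formulation this output is typically scaled by a learnable coefficient and added back to the input as a residual, which is consistent with how the abstract couples self-attention with residual networks in the discriminator.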
