Abstract
Ground-based solar telescopes often encounter cloud interference during observations, leading to varying degrees of cloud cover in solar images. Existing cloud removal methods face several challenges, including incomplete cloud removal, insufficient restoration of solar features, and ineffectiveness against severe cloud cover. This paper proposes a novel deep learning-based approach for cloud removal and feature restoration in Hα full-disk solar images. First, to create a high-quality data set of cloud-covered images, we developed a method for detecting and classifying cloudiness in Hα full-disk solar images. This method correlates average intensity profiles in log-polar coordinates and utilizes the median pixel intensity of the images. Using this approach, we constructed a data set with three levels of cloudiness — mild, moderate, and severe — from high-cadence Hα full-disk solar images acquired at different sites of the Global Oscillation Network Group. Second, our cloud removal network employs an encoder–decoder structure, integrating the Res2Net module into the encoder to capture comprehensive solar image features. The decoder enhances the model's ability to extract semantic information about cloud-covered areas through a residual multiscale attention mechanism. Quantitative and qualitative experimental results demonstrate that the proposed method effectively removes complex cloud interference while preserving solar features, significantly improving the quality of observed data and outperforming existing solar image cloud removal methods. Additionally, the model serves as a valuable reference for removing clouds from other ground-based telescope data. The data and code are available on GitHub (https://github.com/dupeng24/full-disk-cloud-removal) under a CC0 1.0 License, and version 2.0 is archived in Zenodo (10.5281/zenodo.14184221).
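The cloudiness-classification step described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the log-polar sampling uses simple nearest-neighbour indexing, and the correlation and median thresholds (and the clear-sky reference profile) are hypothetical placeholders that would need calibration on real GONG Hα data.

```python
import numpy as np

def log_polar_profile(img, center=None, n_radii=64, n_angles=360):
    """Average image intensity over angle at log-spaced radii
    (nearest-neighbour sampling around the disk center)."""
    h, w = img.shape
    cy, cx = center if center is not None else (h / 2, w / 2)
    r_max = min(cy, cx, h - cy, w - cx) - 1
    radii = np.exp(np.linspace(0.0, np.log(r_max), n_radii))
    theta = np.linspace(0.0, 2 * np.pi, n_angles, endpoint=False)
    ys = (cy + radii[:, None] * np.sin(theta)).round().astype(int)
    xs = (cx + radii[:, None] * np.cos(theta)).round().astype(int)
    return img[ys.clip(0, h - 1), xs.clip(0, w - 1)].mean(axis=1)

def classify_cloudiness(img, reference_profile,
                        corr_thresh=(0.98, 0.90), median_thresh=0.2):
    """Hypothetical three-level classifier: correlate the image's
    log-polar profile with a clear-sky reference profile and check
    the global median intensity. Thresholds are illustrative only."""
    prof = log_polar_profile(img)
    corr = np.corrcoef(prof, reference_profile)[0, 1]
    median = np.median(img)
    if corr >= corr_thresh[0] and median >= median_thresh:
        return "mild"
    if corr >= corr_thresh[1]:
        return "moderate"
    return "severe"
```

For example, a synthetic limb-darkened disk correlates perfectly with its own reference profile and is labeled "mild", while the same disk uniformly dimmed by heavy haze keeps the profile shape (high correlation) but fails the median-intensity check and drops to "moderate"; profile-distorting cloud structure would push the correlation down toward "severe".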