Abstract

Given that clouds can absorb and scatter radiation signals in the visible and infrared bands, cloud detection is a key preprocessing step for ocean color and sea surface temperature retrievals. In this research, a Spectral-and-Textural-Information-Guided deep neural Network (STIGNet) is designed for cloud detection in global ocean data from the Haiyang-1C (HY-1C)/Chinese Ocean Color and Temperature Scanner (COCTS). Considering the spectral and textural properties of clouds, the model incorporates HY-1C/COCTS spectral data, differences in brightness temperature (BT), local statistical characteristics of BT, and geographical location information, all of which are closely related to cloud features. Notably, an edge learning module is implemented to emphasize edge features during training. We construct an HY-1C/COCTS cloud detection dataset to train and test the model. In the dataset, labels are generated by combining the Bayesian cloud detection method with manual masking. Analysis of the resulting cloud detection images indicates that STIGNet performs accurately across various cloud types while producing minimal overestimation errors in areas such as ocean fronts and sun glints, where such errors tend to occur frequently. Ablation experiments on the physics-based input features and the edge learning module show that both enhance cloud detection accuracy. Evaluation results demonstrate an overall accuracy of 96.64%, with a cloud overestimation error of 1.61% and a cloud miss error of 1.76%. These findings highlight the effectiveness of STIGNet in generating precise cloud masks for HY-1C/COCTS data.
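For reference, the sketch below shows one way the three reported scores could be computed from a predicted binary cloud mask and a label mask. The exact definitions used in the paper are not given in the abstract; here, overestimation and miss errors are assumed to be the fractions of all pixels that are false cloud detections and missed clouds, respectively, and the function name `cloud_mask_scores` is purely illustrative.

```python
import numpy as np

def cloud_mask_scores(pred, label):
    """Overall accuracy, cloud overestimation error, and cloud miss error
    from binary masks (1 = cloud, 0 = clear).  Definitions are assumptions
    for illustration; the paper may normalize these quantities differently."""
    pred = np.asarray(pred, dtype=bool)
    label = np.asarray(label, dtype=bool)
    n = pred.size
    overall_accuracy = np.mean(pred == label)     # correctly classified pixels / all pixels
    overestimation = np.sum(pred & ~label) / n    # clear pixels flagged as cloud / all pixels
    miss = np.sum(~pred & label) / n              # cloud pixels flagged as clear / all pixels
    return overall_accuracy, overestimation, miss

# Example with random masks, only to exercise the function
rng = np.random.default_rng(0)
pred = rng.random((512, 512)) > 0.5
label = rng.random((512, 512)) > 0.5
acc, over, miss = cloud_mask_scores(pred, label)
print(f"accuracy={acc:.4f}, overestimation={over:.4f}, miss={miss:.4f}")
```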