Abstract

China has six sunspot observing stations, which have provided over 52,000 hand-drawn sunspot drawings from 1947 to 2016. The stations are the Purple Mountain Astronomical Observatory (PMO), Yunnan Astronomical Observatory (YNAO), Qingdao Observatory Station (QDOS), Sheshan Observatory Station (SSOS), Beijing Planetarium (BJP), and Nanjing University (NJU). In this paper, we propose a new co-training semi-supervised learning method, DMT_BGSeg, which combines dynamic mutual training (DMT) with boundary-guided semantic segmentation (BGSeg). The method makes full use of the labeled data from PMO and the unlabeled data from the other five stations to detect and segment sunspots in all sunspot drawings of the six Chinese stations. Each detected sunspot is further split into four types of components: pore, spot, umbra, and hole. The test results show that the mIoU values for PMO, YNAO, BJP, NJU, QDOS, and SSOS are 85.29, 72.65, 73.82, 64.28, 62.26, and 60.07, respectively. The comparison results also show that DMT_BGSeg is effective in detecting and segmenting sunspots in Chinese sunspot drawings. The numbers and areas of the sunspot components are measured separately. All of the detailed data are publicly shared on China-VO, which will advance the comprehensive augmentation of the global historical sunspot database and further the understanding of the long-term solar activity cycle and the solar dynamo.
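The reported mIoU values are mean intersection-over-union scores expressed as percentages. As an illustration only, and not the authors' evaluation code, the minimal Python sketch below shows how a per-drawing mIoU could be computed from predicted and ground-truth label maps; it assumes a hypothetical five-class labeling (background plus the four components named above), and the class list and function name are illustrative.

```python
import numpy as np

# Hypothetical class indices: background plus the four sunspot components
# named in the abstract (pore, spot, umbra, hole).
CLASSES = ["background", "pore", "spot", "umbra", "hole"]

def mean_iou(pred, gt, num_classes=len(CLASSES)):
    """Per-class IoU and its mean over classes present in pred or gt.

    pred, gt: integer label maps of identical shape (H, W).
    Returns (mIoU as a percentage, list of per-class IoUs).
    """
    ious = []
    for c in range(num_classes):
        pred_c = (pred == c)
        gt_c = (gt == c)
        union = np.logical_or(pred_c, gt_c).sum()
        if union == 0:          # class absent from both maps: skip it
            continue
        inter = np.logical_and(pred_c, gt_c).sum()
        ious.append(inter / union)
    return 100.0 * float(np.mean(ious)), ious

# Usage example with random label maps standing in for real segmentations
pred = np.random.randint(0, len(CLASSES), size=(256, 256))
gt = np.random.randint(0, len(CLASSES), size=(256, 256))
miou, per_class = mean_iou(pred, gt)
print(f"mIoU = {miou:.2f}")
```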
