Abstract

Calcification is an important criterion for distinguishing between benign and malignant thyroid nodules. Deep learning provides an effective means of automatic calcification recognition, but annotating pixel-level labels for calcifications of varied morphology is tedious. This study aims to improve the accuracy of calcification recognition and the prediction of calcification location, while reducing the number of pixel-level labels required for model training. We propose a collaborative supervision network based on attention gating (CS-AGnet), composed of two branches: a segmentation network and a classification network. The reorganized two-stage collaborative semi-supervised model was trained under the supervision of all image-level labels and only a few pixel-level labels. The results show that although our semi-supervised network used only 30% (289 cases) of the pixel-level labels for training, its calcification-recognition accuracy reached 92.1%, very close to the 92.9% achieved by fully supervised training with 100% (966 cases) of the pixel-level labels. CS-AGnet focuses the model's attention on calcification objects and thus achieves higher accuracy than other deep learning methods. Our collaborative semi-supervised model performs well in calcification recognition while reducing the number of manual pixel-level annotations, and it may serve as a useful reference for object recognition in medical datasets with few labels.
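The abstract does not detail the internals of the attention gating used in CS-AGnet, but a common formulation is the additive attention gate (as in Attention U-Net), in which a gating signal re-weights skip-connection features with a learned spatial attention map. The following is a minimal NumPy sketch of that mechanism under this assumption; all shapes, weight initializations, and the function name `attention_gate` are illustrative, not taken from the paper.

```python
import numpy as np

def attention_gate(x, g, inter_ch, rng):
    """Additive attention gate sketch (Attention U-Net style).

    x : skip features, shape (C_x, H, W)
    g : gating features, shape (C_g, H, W)
    Returns x re-weighted by a spatial attention map in (0, 1).
    """
    c_x = x.shape[0]
    c_g = g.shape[0]
    # 1x1-convolution analogues as channel-mixing matrices
    # (randomly initialised here; learned in a real network)
    W_x = rng.standard_normal((inter_ch, c_x)) * 0.1
    W_g = rng.standard_normal((inter_ch, c_g)) * 0.1
    W_psi = rng.standard_normal((1, inter_ch)) * 0.1

    # f = ReLU(W_x * x + W_g * g), mixed over channels at every pixel
    f = np.maximum(np.einsum('ic,chw->ihw', W_x, x)
                   + np.einsum('ig,ghw->ihw', W_g, g), 0.0)
    # alpha = sigmoid(W_psi * f): one attention value per spatial location
    alpha = 1.0 / (1.0 + np.exp(-np.einsum('oi,ihw->ohw', W_psi, f)))
    return x * alpha  # attention map broadcasts over channels

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 16, 16))   # hypothetical skip features
g = rng.standard_normal((4, 16, 16))   # hypothetical gating features
out = attention_gate(x, g, inter_ch=6, rng=rng)
print(out.shape)  # (8, 16, 16)
```

Because the attention map lies in (0, 1), gated features are always attenuated copies of the input features, which is how such a gate can suppress background regions and concentrate the model on candidate calcification objects.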
