Abstract

The successful recognition of benign and malignant breast nodules in ultrasound images is based mainly on supervised learning, which requires a large number of labeled images. However, because high-quality labeling is expensive and time-consuming, we hypothesized that semi-supervised learning could provide a low-cost and powerful alternative. This study aimed to develop an accurate semi-supervised recognition method and to compare its performance with supervised methods and with sonographers. The Faster Region-based Convolutional Neural Network (Faster R-CNN) was used to detect nodules in ultrasound images, and a semi-supervised classifier based on the mean teacher model was proposed to recognize benign and malignant nodule images. The overall performance of the proposed method was reported on two datasets (8,966 nodules). The detection accuracy on the two testing sets (1,350 and 2,220 nodules) was 0.88±0.03 and 0.86±0.02, respectively. When 800 labeled training nodules were available, the proposed semi-supervised model trained with an additional 4,396 unlabeled nodules outperformed the supervised learning model (area under the curve (AUC): 0.934±0.026 vs. 0.83±0.050; 0.916±0.022 vs. 0.815±0.049). The performance of the semi-supervised model trained on 800 labeled and 4,396 unlabeled nodules was close to that of the supervised learning model trained on a large number of labeled nodules (n=5,196) (AUC: 0.934±0.026 vs. 0.952±0.027; 0.916±0.022 vs. 0.918±0.017). Moreover, the semi-supervised model outperformed the average of five human sonographers (AUC: 0.922 vs. 0.889). The semi-supervised model achieved excellent performance for nodule recognition and is potentially useful for medical applications. The method reduced the number of labeled images required for training, thus significantly alleviating the difficulty of data preparation for medical artificial intelligence.
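
As a rough illustration of the semi-supervised scheme described above, the sketch below shows one mean-teacher training step in PyTorch: a student network is trained with a supervised loss on the small labeled batch plus a consistency loss against a teacher whose weights are an exponential moving average (EMA) of the student's. The ResNet-18 backbone, EMA rate, and consistency weight are assumptions for illustration only and are not taken from the paper.

```python
# Minimal mean-teacher sketch (assumed setup, not the authors' exact model).
import copy
import torch
import torch.nn.functional as F
from torchvision import models

def build_models():
    # Student is trained by gradient descent; teacher is a frozen EMA copy.
    student = models.resnet18(num_classes=2)   # benign vs. malignant
    teacher = copy.deepcopy(student)
    for p in teacher.parameters():
        p.requires_grad_(False)
    return student, teacher

@torch.no_grad()
def update_teacher(student, teacher, alpha=0.99):
    # Teacher weights <- EMA of student weights.
    for t_p, s_p in zip(teacher.parameters(), student.parameters()):
        t_p.mul_(alpha).add_(s_p, alpha=1.0 - alpha)

def train_step(student, teacher, optimizer,
               labeled_x, labels, unlabeled_x, cons_weight=1.0):
    student.train()
    # Supervised term on the labeled batch.
    sup_loss = F.cross_entropy(student(labeled_x), labels)
    # Consistency term: the student should agree with the teacher on
    # unlabeled images (in practice, under different augmentations).
    with torch.no_grad():
        teacher_prob = F.softmax(teacher(unlabeled_x), dim=1)
    student_prob = F.softmax(student(unlabeled_x), dim=1)
    cons_loss = F.mse_loss(student_prob, teacher_prob)
    loss = sup_loss + cons_weight * cons_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    update_teacher(student, teacher)
    return loss.item()
```

In this sketch the unlabeled nodules contribute only through the consistency term, which is how the reported setup can leverage 4,396 unlabeled nodules alongside 800 labeled ones.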
