Abstract
The aims of the work described here were to evaluate the learnability of thyroid nodule assessment on ultrasonography (US) using a big data set of US images and to evaluate the diagnostic utility of artificial intelligence computer-aided diagnosis (AI-CAD) when used by readers with varying experience to differentiate benign from malignant thyroid nodules. Six college freshmen independently studied a "learning set" composed of images of 13,560 thyroid nodules, and their diagnostic performance was evaluated after each daily learning session using a "test set" composed of images of 282 thyroid nodules. The diagnostic performance of two residents and an experienced radiologist was evaluated using the same test set. After an initial diagnosis, all readers evaluated the test set a second time with the assistance of AI-CAD. The diagnostic performance of almost all students increased after the learning program. Although the mean areas under the receiver operating characteristic curves (AUROCs) of the residents and the experienced radiologist were significantly higher than those of the students, the AUROCs of five of the six students did not differ significantly from that of one of the residents. With the assistance of AI-CAD, sensitivity significantly increased in three students, specificity in one student, accuracy in four students and AUROC in four students. The diagnostic performance of the two residents and the experienced radiologist also improved with the assistance of AI-CAD. A self-learning method using a big data set of US images has potential as an ancillary tool alongside traditional training methods. With the assistance of AI-CAD, the diagnostic performance of readers with varying experience in thyroid imaging could be further improved.