Abstract

Early detection and identification of malignant thyroid nodules, a vital precursor to treatment, is a difficult task even for experienced clinicians. Many Computer-Aided Diagnosis (CAD) systems have been developed to assist clinicians in performing this task on ultrasound images. Learning-based CAD systems for thyroid nodules generally accommodate both nodule detection/segmentation and fine-grained classification of malignancy, and prior studies often treat these tasks in separate stages, leading to additional computational costs. In this paper, we utilize an online class activation mapping (CAM) mechanism to guide the network to learn discriminative features for identifying thyroid nodules in ultrasound images, called the CAM attention network. It takes nodule masks as localization cues for direct spatial attention of the classification module, thereby avoiding isolated training for classification. Meanwhile, we propose a deformable convolution module that adds offsets to the regular grid sampling locations of standard convolution, guiding the network to capture more discriminative features of nodule areas. Furthermore, we use a generative adversarial network (GAN) to ensure reliable deformations of nodules from the deformable convolution module. Our proposed CAM attention network achieved 2nd place in the classification task of TN-SCUI 2020, a MICCAI 2020 Challenge with, to our knowledge, the largest set of thyroid nodule ultrasound images. The further inclusion of our proposed GAN-guided deformable module allows for capturing more fine-grained features between benign and malignant nodules, and further improves the classification accuracy to a new state-of-the-art level.
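
The following is a minimal sketch (not the authors' released code) of the two ideas the abstract describes: (1) an online CAM reused as spatial attention over the feature map, which the nodule mask can supervise directly, and (2) a deformable convolution whose learned sampling offsets let the network adapt to nodule shapes. The module names (`CAMAttention`, `DeformBlock`), channel sizes, and the use of torchvision's `DeformConv2d` are illustrative assumptions.

```python
# Sketch of CAM-based spatial attention and a deformable convolution block.
# Assumptions: a 2-class (benign/malignant) setting and a 64-channel backbone feature map.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.ops import DeformConv2d


class CAMAttention(nn.Module):
    """Produce a class activation map online and reuse it as spatial attention."""

    def __init__(self, in_channels: int, num_classes: int = 2):
        super().__init__()
        self.classifier = nn.Conv2d(in_channels, num_classes, kernel_size=1)

    def forward(self, feats: torch.Tensor):
        cam = self.classifier(feats)                                # (N, num_classes, H, W)
        logits = cam.mean(dim=(2, 3))                               # global average pooling -> class logits
        attn = torch.sigmoid(cam.max(dim=1, keepdim=True).values)   # (N, 1, H, W) spatial attention
        attended = feats * attn                                     # re-weighted features for classification
        return logits, attn, attended


class DeformBlock(nn.Module):
    """3x3 deformable convolution with learned sampling offsets."""

    def __init__(self, in_channels: int, out_channels: int):
        super().__init__()
        # 2 offsets (dx, dy) per kernel location: 2 * 3 * 3 = 18 channels
        self.offset = nn.Conv2d(in_channels, 18, kernel_size=3, padding=1)
        self.deform = DeformConv2d(in_channels, out_channels, kernel_size=3, padding=1)

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        offsets = self.offset(feats)
        return F.relu(self.deform(feats, offsets))


if __name__ == "__main__":
    feats = torch.randn(2, 64, 32, 32)                              # backbone feature map (assumed size)
    logits, attn, attended = CAMAttention(64)(feats)
    refined = DeformBlock(64, 64)(attended)
    print(logits.shape, attn.shape, refined.shape)
```

In such a setup, the nodule mask would supervise the attention map `attn` (e.g., with a Dice or cross-entropy loss) while `logits` receive the classification loss, so localization cues flow into the classifier without a separate training stage; how the GAN constrains the deformations is described in the paper itself and is not reproduced here.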
