Abstract
An effective segmentation algorithm that delineates thyroid nodules in ultrasound images quickly and accurately is of great value to radiologists. With the rise of deep learning in computer vision, many deep learning-based methods have been proposed to assist radiologists in diagnosing thyroid diseases, covering thyroid nodule classification, detection, and segmentation, but few of them pay attention to malignant thyroid nodule segmentation. Existing segmentation work is largely oriented toward identifying the type of a thyroid nodule. However, type identification is already relatively mature and seldom troubles radiologists. What matters more to radiologists is detecting inconspicuous malignant nodules precisely in ultrasound images, so that surrounding tissue is not confused with malignant thyroid nodules during diagnosis. This paper proposes a deep learning-based computer-aided diagnosis (CAD) method called Dual-route Mirroring U-Net (DMU-Net) to segment malignant thyroid nodules automatically. The method combines two subnets (a U-shape subnet and an inversed U-shape subnet) with three modules (a pyramid attention module (PAM), a margin refinement module (MRM), and an aggregation module (AM)) to extract both the contextual information of thyroid nodules and their margin details in ultrasound images. Furthermore, a mutual learning strategy, originally introduced for natural image classification, is adopted to enhance the performance of DMU-Net. We train and evaluate our method on the self-built Malignant Thyroid Nodule Segmentation (MTNS) dataset, and compare DMU-Net with several classical deep learning-based methods on the MTNS dataset and other public datasets. The results show that DMU-Net achieves superior performance on these datasets.
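As a rough illustration of the dual-route idea described above, the following PyTorch-style sketch pairs a U-shape route with a mirrored (inversed) route and averages their predictions. All class names, layer widths, and the plain averaging used in place of the paper's PAM, MRM, and AM modules are hypothetical assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch of a dual-route segmentation network (PyTorch-style).
# Module names and sizes are hypothetical; the paper's PAM/MRM/AM are not shown.
import torch
import torch.nn as nn
import torch.nn.functional as F

def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
    )

class UShapeSubnet(nn.Module):
    """Standard encoder-decoder route, capturing contextual information."""
    def __init__(self, in_ch=1, base=32):
        super().__init__()
        self.enc1 = conv_block(in_ch, base)
        self.enc2 = conv_block(base, base * 2)
        self.dec1 = conv_block(base * 2 + base, base)
        self.head = nn.Conv2d(base, 1, 1)

    def forward(self, x):
        e1 = self.enc1(x)                          # full resolution
        e2 = self.enc2(F.max_pool2d(e1, 2))        # half resolution
        d1 = self.dec1(torch.cat([F.interpolate(e2, scale_factor=2), e1], dim=1))
        return self.head(d1)

class InversedUShapeSubnet(nn.Module):
    """Mirrored route: expand resolution first to preserve margin detail."""
    def __init__(self, in_ch=1, base=32):
        super().__init__()
        self.up = conv_block(in_ch, base)          # processed at 2x resolution
        self.down = conv_block(base, base * 2)     # back at input resolution
        self.fuse = conv_block(base * 2 + base, base)
        self.head = nn.Conv2d(base, 1, 1)

    def forward(self, x):
        u = self.up(F.interpolate(x, scale_factor=2, mode="bilinear", align_corners=False))
        d = self.down(F.max_pool2d(u, 2))
        f = self.fuse(torch.cat([d, F.max_pool2d(u, 2)], dim=1))
        return self.head(f)

class DMUNetSketch(nn.Module):
    """Two routes whose probability maps are merged; plain averaging stands
    in for the paper's aggregation module (AM)."""
    def __init__(self, in_ch=1):
        super().__init__()
        self.route_a = UShapeSubnet(in_ch)
        self.route_b = InversedUShapeSubnet(in_ch)

    def forward(self, x):
        return (torch.sigmoid(self.route_a(x)) + torch.sigmoid(self.route_b(x))) / 2

net = DMUNetSketch(in_ch=1)
prob = net(torch.randn(2, 1, 64, 64))  # -> (2, 1, 64, 64) probability map
```

Under the mutual learning strategy mentioned above, each route would additionally be trained to match the other route's soft prediction (for example, via a KL-divergence term) on top of the usual segmentation loss; that training loop is omitted from this sketch.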