Abstract

Thyroid nodules are a common endocrine disorder, and accurate segmentation of thyroid nodules in ultrasound images is important for their evaluation and diagnosis, as well as a critical step in computer‐aided diagnosis (CAD) systems. However, achieving accurate and consistent segmentation remains challenging because ultrasound images suffer from scattering noise, low contrast, and low resolution. In this paper, we therefore propose STU3Net, a deep learning‐based CAD method for automatic segmentation of thyroid nodules. The method employs a modified Swin Transformer combined with a CNN encoder to extract the morphological features and edge details of thyroid nodules in ultrasound images. For decoding the extracted features during image reconstruction, we introduce a modified three‐layer U‐Net with cross‐layer connectivity to further improve reconstruction quality. The cross‐layer connectivity creates skip connections between different layers and merges the detailed information of the shallow layers with the abstract information of the deeper layers, strengthening the network's ability to capture and represent image features. Through comparison experiments with current mainstream deep learning methods on the TN3K and BUSI datasets, we validate the superiority of STU3Net in thyroid nodule segmentation. The results show that STU3Net outperforms most mainstream models on the TN3K dataset, reaching a Dice of 0.8368 and an IoU of 0.7416. The method demonstrates excellent performance on these datasets and provides radiologists with an effective auxiliary tool for accurately detecting thyroid nodules in ultrasound images.
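To make the cross‐layer connectivity described above concrete, the following is a minimal PyTorch‐style sketch of a three‐stage U‐Net decoder in which each stage upsamples the deeper features and concatenates them with a shallower encoder feature map before convolution. The class names, channel sizes, and input shapes are illustrative assumptions for exposition, not the authors' STU3Net implementation.

import torch
import torch.nn as nn


class DecoderBlock(nn.Module):
    """Upsample deep features and fuse them with a shallower skip feature."""

    def __init__(self, in_ch: int, skip_ch: int, out_ch: int):
        super().__init__()
        self.up = nn.ConvTranspose2d(in_ch, out_ch, kernel_size=2, stride=2)
        self.conv = nn.Sequential(
            nn.Conv2d(out_ch + skip_ch, out_ch, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x: torch.Tensor, skip: torch.Tensor) -> torch.Tensor:
        x = self.up(x)                    # restore spatial resolution
        x = torch.cat([x, skip], dim=1)   # cross-layer fusion: shallow detail + deep context
        return self.conv(x)


class ThreeLayerUNetDecoder(nn.Module):
    """Three decoder stages, each consuming one encoder skip connection."""

    def __init__(self):
        super().__init__()
        self.dec3 = DecoderBlock(in_ch=512, skip_ch=256, out_ch=256)
        self.dec2 = DecoderBlock(in_ch=256, skip_ch=128, out_ch=128)
        self.dec1 = DecoderBlock(in_ch=128, skip_ch=64, out_ch=64)
        self.head = nn.Conv2d(64, 1, kernel_size=1)  # binary nodule-mask logits

    def forward(self, bottleneck, skips):
        # skips = [shallowest, middle, deepest] encoder feature maps
        x = self.dec3(bottleneck, skips[2])
        x = self.dec2(x, skips[1])
        x = self.dec1(x, skips[0])
        return self.head(x)


if __name__ == "__main__":
    decoder = ThreeLayerUNetDecoder()
    bottleneck = torch.randn(1, 512, 16, 16)          # deepest (e.g. Transformer/CNN) features
    skips = [torch.randn(1, 64, 128, 128),
             torch.randn(1, 128, 64, 64),
             torch.randn(1, 256, 32, 32)]
    mask_logits = decoder(bottleneck, skips)
    print(mask_logits.shape)  # torch.Size([1, 1, 128, 128])

Concatenating each upsampled feature map with its same-resolution encoder counterpart is what lets the decoder recover fine nodule boundaries that would otherwise be lost in the downsampled, more abstract representation.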

