Abstract
Computer-aided diagnosis (CAD) systems based on ultrasound have been developed and widely promoted for breast cancer screening. Because of the low contrast and speckle noise of breast ultrasound images, their segmentation, one of the crucial steps in CAD systems, has always been challenging. Recently, emerging Transformer-based medical segmentation methods, which model long-range dependencies better than convolutional neural networks (CNNs), have shown significant value for medical image segmentation. However, owing to the limited amount of data with high-quality labels, Transformers perform poorly on breast ultrasound image segmentation without pretraining. We therefore propose the Attention-Gate Medical Transformer (AGMT) for small breast ultrasound datasets, which introduces an attention-gate (AG) module to suppress background information and an average radial derivative increment (ΔARD) loss function to enhance shape information. We evaluate the AGMT on a private dataset A and a public dataset B. On dataset A, the AGMT outperforms the Medical Transformer (MT) in true positive ratio, Jaccard index (JI), and Dice similarity coefficient (DSC) by 6.4%, 2.3%, and 1.9%, respectively; compared with UNet, it improves JI and DSC by 5.3% and 4.9%, respectively. These results show a significant improvement over mainstream models. In addition, we conduct ablation experiments on the AG module and the ΔARD loss, which confirm their effectiveness.
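To illustrate the kind of attention-gate module the abstract refers to, the sketch below shows a generic additive attention gate in the style of Attention U-Net, which re-weights skip-connection features with a gating signal so that background responses are suppressed. This is an illustrative assumption, not the AGMT's exact implementation; the class name `AttentionGate`, the channel sizes, and the assumption that the gating signal has already been resized to match the skip features are all hypothetical choices for the example.

```python
import torch
import torch.nn as nn


class AttentionGate(nn.Module):
    """Generic additive attention gate (Attention U-Net style), shown for illustration.

    Re-weights skip-connection features `x` with a coarser gating signal `g`
    so that background activations are suppressed before feature fusion.
    """

    def __init__(self, in_channels_x, in_channels_g, inter_channels):
        super().__init__()
        # 1x1 convolutions project both inputs into a common intermediate space.
        self.theta_x = nn.Conv2d(in_channels_x, inter_channels, kernel_size=1)
        self.phi_g = nn.Conv2d(in_channels_g, inter_channels, kernel_size=1)
        self.psi = nn.Conv2d(inter_channels, 1, kernel_size=1)
        self.relu = nn.ReLU(inplace=True)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x, g):
        # x: skip features (B, Cx, H, W); g: gating signal (B, Cg, H, W),
        # assumed here to be upsampled to the same spatial size as x.
        attn = self.relu(self.theta_x(x) + self.phi_g(g))
        attn = self.sigmoid(self.psi(attn))  # attention map (B, 1, H, W) in [0, 1]
        return x * attn                      # suppress background responses


# Minimal usage example with random tensors (shapes are illustrative).
if __name__ == "__main__":
    gate = AttentionGate(in_channels_x=64, in_channels_g=128, inter_channels=32)
    x = torch.randn(1, 64, 56, 56)   # skip-connection features
    g = torch.randn(1, 128, 56, 56)  # gating signal resized to match x
    print(gate(x, g).shape)          # torch.Size([1, 64, 56, 56])
```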