Abstract

Computer-aided diagnosis (CAD) systems based on ultrasound have been developed and widely promoted for breast cancer screening. Because of the low contrast and speckle noise characteristic of ultrasound, breast ultrasound image segmentation, one of the crucial steps in CAD systems, has always been challenging. Recently, emerging Transformer-based medical segmentation methods, which model long-range dependencies better than convolutional neural networks (CNNs), have shown significant value for medical image segmentation. However, because data with high-quality labels are limited, Transformers perform poorly on breast ultrasound image segmentation without pretraining. We therefore propose the Attention-Gate Medical Transformer (AGMT) for small breast ultrasound datasets, which introduces an attention-gate (AG) module to suppress background information and an average radial derivative increment (ΔARD) loss function to enhance shape information. We evaluate the AGMT on a private dataset A and a public dataset B. On dataset A, the AGMT outperforms the Medical Transformer (MT) in true positive ratio, Jaccard index (JI), and Dice similarity coefficient (DSC) by 6.4%, 2.3%, and 1.9%, respectively; compared with UNet, it improves JI and DSC by 5.3% and 4.9%, respectively. These results represent a significant improvement over mainstream models. In addition, ablation experiments on the AG module and ΔARD confirm their effectiveness.
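
To illustrate the gating idea behind the AG module, the sketch below shows a generic additive attention gate in PyTorch, in the style of Attention U-Net: a gating signal produces a per-pixel attention map that down-weights background activations in the skip-connection features. This is a minimal illustration only, not the AGMT's actual module; the class name `AttentionGate`, the channel sizes, and the assumption that the gating signal is already at the same resolution as the skip features are all assumptions for the example.

```python
import torch
import torch.nn as nn


class AttentionGate(nn.Module):
    """Generic additive attention gate (illustrative sketch, not the AGMT module).

    The gating signal g is combined with the skip-connection features x to
    produce a single-channel attention map that suppresses background regions.
    """

    def __init__(self, in_ch: int, gate_ch: int, inter_ch: int):
        super().__init__()
        self.theta_x = nn.Conv2d(in_ch, inter_ch, kernel_size=1, bias=False)
        self.phi_g = nn.Conv2d(gate_ch, inter_ch, kernel_size=1, bias=True)
        self.psi = nn.Conv2d(inter_ch, 1, kernel_size=1, bias=True)
        self.relu = nn.ReLU(inplace=True)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x: torch.Tensor, g: torch.Tensor) -> torch.Tensor:
        # x: skip features (B, in_ch, H, W); g: gating features (B, gate_ch, H, W)
        # Assumes x and g share spatial size; real implementations may resample g.
        att = self.relu(self.theta_x(x) + self.phi_g(g))
        att = self.sigmoid(self.psi(att))  # (B, 1, H, W) attention map in [0, 1]
        return x * att                     # background activations are down-weighted


if __name__ == "__main__":
    x = torch.randn(2, 64, 32, 32)  # skip-connection features (toy example)
    g = torch.randn(2, 64, 32, 32)  # gating signal at the same resolution (assumed)
    gated = AttentionGate(in_ch=64, gate_ch=64, inter_ch=32)(x, g)
    print(gated.shape)  # torch.Size([2, 64, 32, 32])
```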
