Abstract

Mass segmentation plays an important role in the qualitative and quantitative analysis of 3D automated breast ultrasound (ABUS) volumes. However, accurate 3D mass segmentation remains challenging due to the low signal-to-noise ratio, large variations in tumor size, and the severe class imbalance between foreground and background in ABUS volumes. In this paper, we present a deep learning-based 3D ABUS tumor segmentation method that primarily addresses this class imbalance. A 3D Residual U-Net is designed to effectively learn feature representations during training. To mitigate the class imbalance and balance false positive against false negative predictions, a boundary loss based on a signed non-Euclidean distance map is introduced. The proposed method is trained and evaluated on 83 ABUS volumes. Experimental results show that the proposed method outperforms existing methods, achieving a Dice similarity coefficient (DSC) of 0.82 ± 0.08 on the test set, which demonstrates its effectiveness for 3D ABUS mass segmentation.
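To illustrate the boundary-loss idea summarized above, the sketch below shows one common way such a term can be computed: a signed distance map is derived from the ground-truth mask and multiplied element-wise with the predicted foreground probabilities, so that misclassified voxels far from the true boundary are penalized more heavily. This is a minimal sketch under assumptions of our own, not the authors' implementation: it uses a Euclidean distance transform from SciPy for simplicity (the paper specifies a non-Euclidean distance map), and the helper names `signed_distance_map` and `boundary_loss` are illustrative only.

```python
# Minimal sketch of a boundary loss weighted by a signed distance map.
# Assumption: Euclidean distance transform is used here; the paper's method
# relies on a non-Euclidean variant, which would replace distance_transform_edt.
import numpy as np
import torch
from scipy.ndimage import distance_transform_edt


def signed_distance_map(mask: np.ndarray) -> np.ndarray:
    """Signed distance map of a binary mask: negative inside the mass,
    positive outside, zero on the boundary (illustrative helper)."""
    mask = mask.astype(bool)
    if not mask.any():
        return np.zeros(mask.shape, dtype=np.float32)
    dist_outside = distance_transform_edt(~mask)  # distance of background voxels to the mass
    dist_inside = distance_transform_edt(mask)    # distance of mass voxels to the background
    return (dist_outside - dist_inside).astype(np.float32)


def boundary_loss(probs: torch.Tensor, sdm: torch.Tensor) -> torch.Tensor:
    """Boundary loss term: mean of predicted foreground probabilities
    weighted by the signed distance map of the ground truth.

    probs: network output for the mass class, shape (B, D, H, W).
    sdm:   signed distance map of the ground-truth mask, same shape.
    Confident predictions far outside the true mass (large positive sdm)
    increase the loss; predictions inside it (negative sdm) decrease it.
    """
    return (probs * sdm).mean()


# Example usage on a toy volume (batch of one 32x32x32 crop).
gt = np.zeros((32, 32, 32), dtype=np.uint8)
gt[10:20, 10:20, 10:20] = 1
sdm = torch.from_numpy(signed_distance_map(gt)).unsqueeze(0)
probs = torch.rand(1, 32, 32, 32)
loss = boundary_loss(probs, sdm)
```

In practice a term like this is usually combined with a regional loss such as Dice, with the relative weight of the boundary term increased gradually over training.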
