Abstract
Breast cancer is one of the leading causes of death among women worldwide. Clinical practice urgently needs an accurate, automatic breast segmentation method to detect lesions in breast ultrasound images. Recently, deep learning methods based on fully convolutional networks have demonstrated competitive performance in breast ultrasound segmentation. However, Unet loses some feature information during down-sampling, which lowers segmentation accuracy. Furthermore, there is a semantic gap between the encoder and decoder feature maps in Unet, so the simple fusion of high- and low-level features is not conducive to the semantic classification of pixels. In addition, the poor quality of breast ultrasound images also affects segmentation accuracy. To solve these problems, we propose a new end-to-end network model called Dense skip Unet (DsUnet), which consists of a Unet backbone, short skip connections and deep supervision. The proposed method effectively avoids the loss of feature information caused by down-sampling and fuses multi-level semantic information. We optimized DsUnet with a new loss function composed of binary cross-entropy and the Dice coefficient. We employed the True Positive Fraction (TPF), False Positives per image (FPs) and F-measure as performance metrics for evaluating the various methods. We adopted the UDIAT 212 dataset, and the experimental results validate that our new approach achieves better performance than other existing methods in detecting and segmenting ultrasound breast lesions. With the DsUnet model and the new loss function (binary cross-entropy + Dice coefficient), the best performance indexes are achieved, i.e., 0.87 in TPF, 0.13 in FPs/image and 0.86 in F-measure.
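The abstract states that DsUnet is optimized with a loss combining binary cross-entropy and the Dice coefficient. The sketch below illustrates one common way to combine the two terms; it is not taken from the paper, and the PyTorch framing, the equal weighting of the two terms, the smoothing constant and the class name BCEDiceLoss are all assumptions made for illustration.

    import torch
    import torch.nn as nn

    class BCEDiceLoss(nn.Module):
        """Illustrative combined BCE + Dice loss for binary segmentation (assumed form)."""

        def __init__(self, smooth=1.0):
            super().__init__()
            self.bce = nn.BCEWithLogitsLoss()
            self.smooth = smooth  # assumed smoothing constant to avoid division by zero

        def forward(self, logits, targets):
            # Binary cross-entropy computed on raw logits
            bce_loss = self.bce(logits, targets)

            # Dice loss computed on sigmoid probabilities, per sample
            probs = torch.sigmoid(logits).view(logits.size(0), -1)
            targets_flat = targets.view(targets.size(0), -1)
            intersection = (probs * targets_flat).sum(dim=1)
            dice = (2.0 * intersection + self.smooth) / (
                probs.sum(dim=1) + targets_flat.sum(dim=1) + self.smooth
            )
            dice_loss = 1.0 - dice.mean()

            # Equal weighting of the two terms is an assumption, not the paper's stated choice
            return bce_loss + dice_loss

In practice such a loss is passed predicted logits and binary ground-truth masks of the same shape; the BCE term drives per-pixel classification while the Dice term counteracts the foreground/background class imbalance typical of lesion masks.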