Abstract

Breast ultrasound image segmentation is a key step in the clinical computer-aided diagnosis of breast cancer, a disease that seriously threatens women's health. Deep learning methods have been successfully applied to breast tumor segmentation; however, blurred boundaries, heterogeneous structures, and other factors can cause serious missed and false detections in the segmentation results. In this paper, we develop a novel refinement residual convolutional network to segment breast tumors accurately from ultrasound images. It is mainly composed of a SegNet with a deep supervision module, a missed-detection residual network, and a false-detection residual network. In the SegNet, we add six side-output deep supervision modules that guide the network to predict precise segmentation masks scale by scale. In the missed-detection residual network, the receptive fields provided by different dilation rates supply more global information, which is easily lost in deep convolutional layers. The missed-detection and false-detection residual networks push the network to spend more effort on hard-to-predict pixels, yielding more accurate segmentation of the breast tumor. To evaluate segmentation performance, we compared our network with several state-of-the-art segmentation approaches using five quantitative metrics on two public breast ultrasound datasets. Experimental results demonstrate that our method achieves the best segmentation results, indicating that it adapts well to breast tumor segmentation.
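The abstract attributes the extra global context in the missed-detection branch to dilated convolutions with different dilation rates. As a minimal illustration of why that works, the sketch below computes the effective span of a dilated kernel; the assumption of a 3×3 base kernel and the specific dilation rates (1, 2, 4, 8) are hypothetical, since the abstract does not state them.

```python
def effective_kernel_size(k: int, d: int) -> int:
    """Span (in input positions) covered by a k-tap kernel with dilation d.

    A dilation rate d inserts d-1 gaps between adjacent kernel taps,
    so the kernel covers d * (k - 1) + 1 input positions without
    adding any parameters.
    """
    return d * (k - 1) + 1


if __name__ == "__main__":
    # Hypothetical dilation rates; the paper only says "different dilation rates".
    for d in (1, 2, 4, 8):
        print(f"dilation {d}: 3-tap kernel spans {effective_kernel_size(3, d)} positions")
```

The point is that stacking the same small kernel with growing dilation rates enlarges the receptive field geometrically at constant parameter cost, which is how the missed-detection branch can recover global context lost in deep layers.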
