Abstract

Ultrasonography images of breast masses aid in the detection and diagnosis of breast cancer. Manually analyzing ultrasonography images is time-consuming, exhausting, and subjective, so automated analysis of such images is desirable. In this study, we develop an automated breast cancer diagnosis model for ultrasonography images. Traditional methods for automated analysis of ultrasonography images employ hand-crafted features to classify images and lack robustness to variations in the shape, size, and texture of breast lesions, leading to low sensitivity in clinical applications. To overcome these shortcomings, we propose a method to diagnose breast ultrasonography images using deep convolutional neural networks with multi-scale kernels and skip connections. Our method consists of two components: the first determines whether there are malignant tumors in the image, and the second recognizes solid nodules. To let the two networks work collaboratively, a region enhancement mechanism based on class activation maps is proposed. This mechanism improves classification accuracy and sensitivity for both networks. A cross-training algorithm is introduced to train the networks. We construct a large annotated dataset containing a total of 8145 breast ultrasonography images to train and evaluate the models. All of the annotations are confirmed by pathological records. The proposed method is compared with two state-of-the-art approaches and outperforms both of them by a large margin. Experimental results show that our approach achieves performance comparable to human sonographers and can be applied in clinical scenarios.
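The region enhancement mechanism described above builds on class activation maps (CAMs), which weight the final convolutional feature maps by the classifier weights of a target class to localize discriminative regions. The abstract does not give implementation details, so the following is a minimal, hypothetical sketch in NumPy: the feature shapes, the `alpha` enhancement strength, and the multiplicative enhancement rule are all illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def class_activation_map(features, weights):
    """Standard CAM: weighted sum of feature maps (C, H, W) by the
    target class's classifier weights (C,), normalized to [0, 1]."""
    cam = np.tensordot(weights, features, axes=1)  # (H, W)
    cam -= cam.min()
    if cam.max() > 0:
        cam /= cam.max()
    return cam

def enhance_region(image, cam, alpha=0.5):
    """Amplify image intensities where the CAM is high. `alpha` is a
    hypothetical enhancement strength, not taken from the paper."""
    return image * (1.0 + alpha * cam)

# Toy inputs standing in for one network's activations and an image patch.
rng = np.random.default_rng(0)
features = rng.random((64, 8, 8))  # last conv-layer feature maps
weights = rng.random(64)           # classifier weights of one class
image = rng.random((8, 8))         # single-channel patch, same spatial size

cam = class_activation_map(features, weights)
enhanced = enhance_region(image, cam)
```

In the collaborative setup the abstract describes, a map produced by one network could be used this way to emphasize suspicious regions in the input seen by the other; in practice the CAM would also be upsampled to the full image resolution.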
