Abstract

Deep learning methods, especially convolutional neural networks, have advanced the breast lesion classification task using breast ultrasound (BUS) images. However, constructing a highly accurate classification model remains challenging due to the complex patterns, relatively low contrast, and fuzzy boundaries between lesion regions (i.e., foreground) and the surrounding tissues (i.e., background). Few studies have separated the foreground and background to learn domain-specific representations and then fused them to improve model performance. In this paper, we propose a saliency map-guided hierarchical dense feature aggregation framework for breast lesion classification using BUS images. Specifically, we first generate saliency maps for the foreground and background via super-pixel clustering and multi-scale region grouping. Then, a triple-branch network, including two feature extraction branches and a feature aggregation branch, is constructed to learn and fuse discriminative representations under the guidance of priors provided by the saliency maps. In particular, the two feature extraction branches take the original image and the corresponding saliency map as input to extract foreground- and background-specific representations. Subsequently, the hierarchical feature aggregation branch receives and fuses features from different stages of the two extraction branches for lesion classification in a task-oriented manner. The proposed model was evaluated on three datasets using 5-fold cross-validation, and the experimental results demonstrate that it outperforms several state-of-the-art deep learning methods for breast lesion diagnosis using BUS images.
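The abstract outlines two technical steps that a sketch can make concrete. The first is saliency-map generation via super-pixel clustering and multi-scale region grouping. The multi-scale grouping procedure is not specified in the abstract, so the snippet below is only a toy, single-scale illustration of the super-pixel idea: it assumes scikit-image's SLIC segmentation and the fact that breast lesions in BUS images are typically hypoechoic (darker than surrounding tissue). The function name `saliency_maps` and the intensity heuristic are illustrative assumptions, not the authors' method.

```python
import numpy as np
from skimage.segmentation import slic  # assumes scikit-image >= 0.19


def saliency_maps(image, n_segments=200):
    """Toy single-scale saliency: cluster the image into superpixels and
    score each one by how much darker it is than the global mean, since
    lesions in BUS images are typically hypoechoic (darker).

    Returns (foreground_map, background_map), both in [0, 1].
    """
    labels = slic(image, n_segments=n_segments, channel_axis=None)
    fg = np.zeros_like(image, dtype=float)
    global_mean = image.mean()
    for lab in np.unique(labels):
        mask = labels == lab
        # Darker-than-average superpixels get a higher foreground score.
        fg[mask] = max(0.0, global_mean - image[mask].mean())
    fg /= fg.max() + 1e-8  # normalize; guard against a constant image
    return fg, 1.0 - fg
```

The second step is the triple-branch network: two feature extraction branches that each see the image together with one saliency map, plus an aggregation branch that fuses features from every stage. The PyTorch sketch below wires up one plausible version of that topology. The stage widths, the concatenation-based fusion, and the pooling used to align spatial resolutions are all assumptions made for illustration, not the published architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConvStage(nn.Module):
    """One backbone stage: 3x3 conv -> BN -> ReLU -> 2x downsampling."""

    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
        )

    def forward(self, x):
        return self.block(x)


class TripleBranchNet(nn.Module):
    """Hypothetical triple-branch classifier: the foreground and background
    branches each take the BUS image concatenated with the corresponding
    saliency map; an aggregation branch fuses their stage-wise features."""

    def __init__(self, stages=(16, 32, 64), num_classes=2):
        super().__init__()
        in_chs = [2] + list(stages[:-1])  # grayscale image + saliency channel
        self.fg = nn.ModuleList([ConvStage(i, o) for i, o in zip(in_chs, stages)])
        self.bg = nn.ModuleList([ConvStage(i, o) for i, o in zip(in_chs, stages)])
        # Aggregation stage i fuses fg_i, bg_i, and the pooled output of the
        # previous aggregation stage (absent at the first stage).
        agg_in = [2 * stages[0]] + [
            2 * stages[i] + stages[i - 1] for i in range(1, len(stages))
        ]
        self.agg = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(i, o, kernel_size=3, padding=1),
                nn.BatchNorm2d(o),
                nn.ReLU(inplace=True),
            )
            for i, o in zip(agg_in, stages)
        ])
        self.head = nn.Linear(stages[-1], num_classes)

    def forward(self, image, fg_saliency, bg_saliency):
        f = torch.cat([image, fg_saliency], dim=1)
        b = torch.cat([image, bg_saliency], dim=1)
        a = None
        for fg_stage, bg_stage, agg_stage in zip(self.fg, self.bg, self.agg):
            f, b = fg_stage(f), bg_stage(b)
            # Pool the previous aggregation output so all inputs share the
            # same spatial resolution before channel-wise concatenation.
            fused = [f, b] if a is None else [f, b, F.max_pool2d(a, 2)]
            a = agg_stage(torch.cat(fused, dim=1))
        a = F.adaptive_avg_pool2d(a, 1).flatten(1)  # global average pooling
        return self.head(a)


if __name__ == "__main__":
    net = TripleBranchNet()
    img = torch.randn(4, 1, 128, 128)  # grayscale BUS images
    fg = torch.rand(4, 1, 128, 128)    # foreground saliency maps
    bg = torch.rand(4, 1, 128, 128)    # background saliency maps
    print(net(img, fg, bg).shape)      # torch.Size([4, 2])
```

Pooling the previous aggregation output before concatenation keeps all three inputs at the same spatial resolution, which is one simple way to realize the stage-wise hierarchical fusion the abstract describes.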
