Abstract

Breast cancer diagnosis from medical imaging requires both fine-grained lesion segmentation and disease grading. Although deep learning (DL) offers a powerful feature-learning paradigm for these two tasks, its adoption in practice is hampered by limited interpretability, weak generalization ability, and the scarcity of large labeled training sets. In this article, we propose a hierarchical fused model based on DL and fuzzy learning that addresses these drawbacks for pixelwise segmentation and disease grading of mammography breast images. The proposed system consists of a segmentation model (ResU-segNet) and a hierarchical fuzzy classifier (HFC) that fuses interval type-2 possibilistic fuzzy c-means clustering with a fuzzy neural network. ResU-segNet segments the masks of mass regions from the images using convolutional neural networks, while the HFC encodes features from the mass images and masks to obtain the disease grade through fuzzy representation and rule-based learning. By integrating domain-knowledge-guided feature extraction with fuzzy learning, the system achieves favorable performance in a few-shot learning setting, alleviates the deterioration of cross-dataset generalization, and further enhances interpretability. The effectiveness of the proposed system is evaluated on the publicly available INbreast mammogram database and a private database through cross-validation, together with thorough comparative experiments.
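The two-stage flow described above (segment the mass, then grade it from features of the image and mask) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: `segment_mass` is a thresholding stand-in for the ResU-segNet CNN, `extract_features` uses two hypothetical hand-crafted features, and the membership step uses standard type-1 fuzzy c-means memberships as a simplified stand-in for the interval type-2 possibilistic variant in the HFC.

```python
import numpy as np

def segment_mass(image, thresh=0.5):
    # Stand-in for ResU-segNet: a CNN would predict this mask;
    # here a plain intensity threshold is used purely for illustration.
    return (image > thresh).astype(float)

def extract_features(image, mask):
    # Hypothetical hand-crafted features of the mass region:
    # mean intensity inside the mask and relative mass area.
    area = mask.sum() / mask.size
    mean_intensity = (image * mask).sum() / max(mask.sum(), 1.0)
    return np.array([mean_intensity, area])

def fcm_memberships(x, centers, m=2.0):
    # Standard fuzzy c-means membership of sample x in each cluster,
    # a simplified stand-in for the interval type-2 possibilistic
    # fuzzy c-means used by the HFC. m > 1 is the fuzzifier.
    d = np.linalg.norm(centers - x, axis=1) + 1e-12
    inv = d ** (-2.0 / (m - 1.0))
    return inv / inv.sum()

# Toy end-to-end run on a synthetic "mammogram" patch.
rng = np.random.default_rng(0)
image = rng.random((8, 8))
mask = segment_mass(image)
feats = extract_features(image, mask)
centers = np.array([[0.3, 0.2], [0.8, 0.6]])  # hypothetical grade prototypes
u = fcm_memberships(feats, centers)
grade = int(np.argmax(u))  # hard grade from fuzzy memberships
```

In the actual system the memberships would feed a rule-based fuzzy neural network rather than a hard argmax; the sketch only illustrates how segmentation output and fuzzy clustering compose into a grading decision.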
