Abstract

Breast cancer is among the leading causes of cancer-related death in women worldwide. Because modern imaging techniques cannot reliably distinguish benign from malignant tumors, histopathological examination remains the diagnostic standard despite its difficulty. This article introduces a hybrid deep learning method with recursive feature elimination for classifying histopathology images of breast lesions as benign or malignant on standard datasets, with the goal of increasing diagnostic accuracy. The proposed four-stage model incorporates several novel techniques. First, a modified Lipschitz-based image augmentation technique ensures a proportionate representation of the sample classes, and k-fold cross-validation mitigates the bias introduced by the choice of training and testing splits. Next, features are extracted by a hybrid convolutional neural network (CNN) approach that integrates three CNNs (PResNet-34, FE-VGG-16, and M-AlexNet). The proposed universum twin support vector machine-based recursive feature elimination technique then reduces the 1,000 features extracted by each deep model to 100. Finally, the reduced feature sets are fused into a set of 300 features, which is fed into machine learning classifiers: decision tree, k-nearest neighbor, linear discriminant analysis, linear regression, and support vector machine. The results demonstrate that the proposed hybrid CNN model outperforms state-of-the-art techniques in classifying breast lesions as benign or malignant, achieving 99.99% accuracy, 99.56% sensitivity, 99.53% specificity, 99.55% precision, and a 99.77% F1 score.
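The selection-and-fusion stage described above can be sketched with off-the-shelf tools. The snippet below is a minimal illustration, not the authors' implementation: scikit-learn's standard `RFE` with a `LinearSVC` ranker stands in for the paper's universum twin SVM-based recursive feature elimination, and synthetic 1,000-dimensional vectors stand in for the deep features of the three CNN branches. It reduces each branch from 1,000 to 100 features, fuses them into 300, and trains an SVM classifier.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC, SVC

# Synthetic stand-in for three 1,000-dim deep feature vectors per sample
# (one block per CNN branch); the real pipeline would use CNN activations.
X, y = make_classification(
    n_samples=300, n_features=3000, n_informative=60, random_state=0
)
branches = [X[:, i * 1000 : (i + 1) * 1000] for i in range(3)]

# Recursive feature elimination: 1,000 -> 100 features per branch.
# LinearSVC is a simplified stand-in for the universum twin SVM ranker.
selected = []
for Xb in branches:
    rfe = RFE(
        LinearSVC(dual=False, max_iter=5000),
        n_features_to_select=100,
        step=100,  # drop 100 lowest-ranked features per iteration
    )
    selected.append(rfe.fit_transform(Xb, y))

# Fuse the three reduced sets into a single 300-feature representation.
X_fused = np.hstack(selected)

# Feed the fused features to a downstream classifier (SVM here).
X_tr, X_te, y_tr, y_te = train_test_split(
    X_fused, y, test_size=0.3, random_state=0, stratify=y
)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
print(f"fused shape: {X_fused.shape}, SVM accuracy: {acc:.2f}")
```

In the same way, any of the other listed classifiers (decision tree, k-nearest neighbor, linear discriminant analysis) could be trained on `X_fused` in place of the SVM.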
