Abstract

Histopathology images are highly distinctive, and a single image may contain thousands of objects, so transferring features learned from natural images to histopathology images may not yield impressive results. In this study, we propose a novel modality-specific CBAM-VGGNet model for classifying H&E-stained breast histopathology images. Instead of using models pre-trained on ImageNet, we train VGG16 and VGG19 on same-domain cancerous histopathology datasets and then use them as fixed feature extractors. We add a global average pooling (GAP) layer and a Convolutional Block Attention Module (CBAM) after the first convolutional layer of the convolutional blocks; CBAM is an effective module that helps neural networks focus on relevant features. We implement VGG16 and VGG19 in a novel way, following the configuration of state-of-the-art models with our own concatenated layers. Adding the GAP layer to VGGNet reduces the number of parameters and hence the required computational power. The two models are combined using an averaging ensemble technique, and features extracted from the final ensembled model are passed to a feed-forward neural network. We also propose a hybrid pre-processing technique that first applies a median filter and then contrast-limited adaptive histogram equalization (CLAHE). The median filter removes significant noise, which directly affects image quality, while CLAHE improves the local contrast of an image and enhances weak boundary edges. The proposed CBAM ensemble model outperforms state-of-the-art models, achieving 98.96% accuracy and a 97.95% F1-score on the 400X subset of the BreakHis dataset.
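The hybrid pre-processing step can be illustrated with a minimal sketch using OpenCV: a median filter followed by CLAHE applied to the luminance channel. The kernel size, clip limit, and tile grid size below are illustrative assumptions, not values reported in the paper.

```python
import cv2

def preprocess_patch(bgr_image, median_ksize=3, clip_limit=2.0, tile_grid=(8, 8)):
    # 1) Median filter to suppress noise while preserving edges.
    denoised = cv2.medianBlur(bgr_image, median_ksize)

    # 2) CLAHE on the luminance (L) channel to boost local contrast
    #    and strengthen weak boundary edges.
    lab = cv2.cvtColor(denoised, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=clip_limit, tileGridSize=tile_grid)
    l_eq = clahe.apply(l)
    return cv2.cvtColor(cv2.merge((l_eq, a, b)), cv2.COLOR_LAB2BGR)

# Usage with a hypothetical file path:
# patch = cv2.imread("breakhis_400x_sample.png")
# clean = preprocess_patch(patch)
```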
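The averaging ensemble of the two backbones can be sketched in Keras as follows; the sub-model builders and input shape are assumptions, and the paper's exact CBAM-VGG layer configuration is not reproduced here.

```python
import tensorflow as tf

def build_averaging_ensemble(model_a, model_b, input_shape=(224, 224, 3)):
    """Average the outputs of two trained sub-models (e.g. CBAM-VGG16 and CBAM-VGG19)."""
    inputs = tf.keras.Input(shape=input_shape)
    # Element-wise average of the two sub-model outputs.
    averaged = tf.keras.layers.Average()([model_a(inputs), model_b(inputs)])
    return tf.keras.Model(inputs=inputs, outputs=averaged, name="cbam_vgg_ensemble")
```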
