Abstract

Breast cancer is among the most lethal of all cancer types. Timely detection is paramount, as late diagnosis can exacerbate its severity. Computer-aided detection systems can complement the clinician in early decision-making. Therefore, in this study, a multi-level, fully convolutional, attention-based transfer learning approach named ‘FCCS-Net’ is proposed for breast cancer classification. In contrast to shared multi-layer perceptron (MLP)-based attention mechanisms, the proposed approach employs a fully convolutional attention mechanism to focus on important cellular features in the inter-channel and intra-channel feature space. This attention is applied across multiple levels of a pre-trained ResNet18 model, supplemented with additional residual connections. The performance of the proposed FCCS-Net is evaluated on the publicly available ‘BreakHis’, ‘IDC’, and ‘BACH’ breast cancer histopathology datasets. On the BreakHis dataset, the proposed method achieves accuracy rates of 99.25%, 98.32%, 99.50%, and 96.98% at 40X, 100X, 200X, and 400X optical zoom levels, respectively. On the IDC dataset, a classification accuracy of 90.58% is attained at 40X magnification, while an average classification accuracy of 91.25% is obtained on the BACH dataset. These findings establish the robustness and efficacy of FCCS-Net in detecting breast cancer from histopathology images. The regions attended to by each attention layer are also visually explained. The integration of multi-level, fully convolutional attention with supplementary residual connections holds the potential to advance breast cancer detection methodologies. The relevant PyTorch code for implementing the FCCS-Net model can be accessed at https://github.com/maurya123ritesh47/FCCS-Net.
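
The abstract describes a fully convolutional attention block (replacing the shared-MLP gate) applied at multiple stages of a pre-trained ResNet18, with an extra residual connection around each stage. The exact layer choices and kernel sizes are not specified here; the authoritative implementation is in the linked repository. The following is only a minimal PyTorch sketch of that idea, with the class names `FullyConvAttention` and `AttnStage` and the 7x7 spatial kernel being illustrative assumptions rather than details taken from the paper.

```python
import torch
import torch.nn as nn
import torchvision


class FullyConvAttention(nn.Module):
    """Illustrative fully convolutional attention: inter-channel weighting via
    1x1 convolutions (instead of a shared MLP), followed by intra-channel
    (spatial) weighting via a single convolution over pooled feature maps."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # Channel gate: squeeze spatial dims, then two 1x1 convs + sigmoid.
        self.channel_gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1, bias=False),
            nn.Sigmoid(),
        )
        # Spatial gate: 7x7 conv over channel-wise mean/max maps (assumed kernel size).
        self.spatial_gate = nn.Sequential(
            nn.Conv2d(2, 1, kernel_size=7, padding=3, bias=False),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = x * self.channel_gate(x)  # inter-channel attention
        pooled = torch.cat(
            [x.mean(dim=1, keepdim=True), x.amax(dim=1, keepdim=True)], dim=1
        )
        return x * self.spatial_gate(pooled)  # intra-channel (spatial) attention


class AttnStage(nn.Module):
    """Wraps one ResNet stage with attention and a supplementary residual
    connection, so attended features are added back to the stage output."""

    def __init__(self, stage: nn.Module, channels: int):
        super().__init__()
        self.stage = stage
        self.attn = FullyConvAttention(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y = self.stage(x)
        return y + self.attn(y)


# Hypothetical usage: attach the attention at multiple levels of a
# pre-trained ResNet18 (stage channel widths are 64/128/256/512).
backbone = torchvision.models.resnet18(weights="IMAGENET1K_V1")
backbone.layer2 = AttnStage(backbone.layer2, 128)
backbone.layer3 = AttnStage(backbone.layer3, 256)
backbone.layer4 = AttnStage(backbone.layer4, 512)
backbone.fc = nn.Linear(512, 2)  # e.g. benign vs. malignant for BreakHis

out = backbone(torch.randn(1, 3, 224, 224))
print(out.shape)  # torch.Size([1, 2])
```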
