Abstract

Breast cancer is a leading cause of morbidity and mortality among women worldwide. Compared with many other cancers, early detection of breast cancer does more to improve patient prognosis, so clinical practice requires rapid and accurate diagnosis to enable early treatment. The development of an automatic breast cancer detection system suited to patient imaging is therefore of great value in assisting clinical care. Accurate classification of pathological images plays a key role in computer-aided medical diagnosis and prognosis. However, in automatic recognition and classification of breast cancer pathological images, the loss of scale information, the loss of image information caused by insufficient feature fusion, and overly large model structures can lead to inaccurate or inefficient classification. To mitigate these problems, we propose a lightweight PCSAM-ResCBAM model based on a two-stage convolutional neural network. The model comprises a Parallel Convolution Scale Attention Module network (PCSAM-Net) and a Residual Convolutional Block Attention Module network (ResCBAM-Net). The first-stage convolutional network is built from a four-layer PCSAM module and predicts the class of patches extracted from images. To improve the network's ability to represent global image features, we propose a tiled feature fusion method that fuses patch features from the same image, together with a residual convolutional attention module. On this basis, the second-stage convolutional network is constructed to classify whole images. We evaluated the proposed model on the ICIAR2018 and BreakHis datasets. Ablation studies further showed that scale attention and dilated convolution play an important role in improving model performance. The proposed model outperforms existing state-of-the-art models on the 200× and 400× magnification datasets, with a maximum accuracy of 98.74%.
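
To make the two-stage idea concrete, below is a minimal PyTorch sketch of the pipeline described above: a patch-level classifier whose feature vectors are tiled back into a spatial grid and refined by a CBAM-style channel/spatial attention block before image-level classification. All module names (PatchNet, TwoStageClassifier, ChannelSpatialAttention), layer sizes, and the 4 × 4 patch grid are illustrative assumptions and do not reproduce the authors' PCSAM-Net or ResCBAM-Net implementations.

```python
# Illustrative sketch only: patch-level classification, tiled feature fusion,
# and CBAM-style attention for image-level classification. Architecture details
# are assumptions, not the paper's actual PCSAM-Net / ResCBAM-Net code.
import torch
import torch.nn as nn

class ChannelSpatialAttention(nn.Module):
    """CBAM-style attention: channel attention followed by spatial attention."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.channel_mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(),
            nn.Linear(channels // reduction, channels))
        self.spatial_conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def forward(self, x):
        b, c, _, _ = x.shape
        # Channel attention from average- and max-pooled descriptors.
        avg = self.channel_mlp(x.mean(dim=(2, 3)))
        mx = self.channel_mlp(x.amax(dim=(2, 3)))
        x = x * torch.sigmoid(avg + mx).view(b, c, 1, 1)
        # Spatial attention from per-pixel channel statistics.
        s = torch.cat([x.mean(dim=1, keepdim=True), x.amax(dim=1, keepdim=True)], dim=1)
        return x * torch.sigmoid(self.spatial_conv(s))

class PatchNet(nn.Module):
    """Stage 1 (stand-in for PCSAM-Net): encodes and classifies a single patch."""
    def __init__(self, feat_dim=64, num_classes=2):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, feat_dim, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1))
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, patches):                   # (N, 3, H, W)
        feats = self.encoder(patches).flatten(1)  # (N, feat_dim)
        return feats, self.classifier(feats)

class TwoStageClassifier(nn.Module):
    """Stage 2: tile patch features into a grid and classify the whole image."""
    def __init__(self, feat_dim=64, grid=(4, 4), num_classes=2):
        super().__init__()
        self.grid = grid
        self.patch_net = PatchNet(feat_dim, num_classes)
        self.attention = ChannelSpatialAttention(feat_dim)
        self.head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                  nn.Linear(feat_dim, num_classes))

    def forward(self, patches):                   # all patches of one image, row-major order
        gh, gw = self.grid
        feats, patch_logits = self.patch_net(patches)
        # Tiled feature fusion: arrange patch vectors in their spatial layout.
        fused = feats.t().reshape(1, -1, gh, gw)  # (1, feat_dim, gh, gw)
        fused = self.attention(fused)
        return self.head(fused), patch_logits

if __name__ == "__main__":
    model = TwoStageClassifier()
    patches = torch.randn(16, 3, 64, 64)          # a 4x4 grid of 64x64 patches
    image_logits, patch_logits = model(patches)
    print(image_logits.shape, patch_logits.shape) # torch.Size([1, 2]) torch.Size([16, 2])
```

Tiling the patch features back into a grid preserves their spatial arrangement within the original image, which is what lets the second-stage attention block capture global context that per-patch classification alone would miss.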

