Abstract

Breast cancer is one of the most common cancers in women and the leading cause of cancer mortality among women aged 20 to 59. Detection at an early stage increases the survival rate to about 80%, whereas the rate falls to around 40% when the disease is diagnosed at a later stage, so there is a significant need for early diagnosis in women suspected of having the disease. Hematoxylin and eosin (H&E)–stained tissue samples taken by pathologists from suspected breast regions are viewed with whole slide imaging (WSI) and can be stored as digital histopathology images. Manual analysis of histopathology images is time-consuming and subject to inter- and intraobserver variability; hence, automated classification methods are an active research focus. Breast tumors are classified as benign (non-cancerous) or malignant (cancerous). Deep learning with convolutional neural networks (CNNs) has proven successful for image segmentation and classification, but deep CNNs require large labeled datasets, which are scarce in medicine, particularly for breast cancer images, and training a CNN from scratch demands high computational power, time, and memory. To overcome this, transfer learning from natural images is used, drawing on ImageNet, a dataset comprising 1.2 million images in 1,000 classes. Pretrained models such as VGG16, ResNet50, Inception, AlexNet, and DenseNet are employed for transfer learning and fine-tuned for refinement. In the proposed method, four network architectures, namely AlexNet, VGG16, Inception v3, and DenseNet121, are used to classify histopathological images via transfer learning. The top dense layers of these architectures are removed, and custom dense layers are added for classification.
Also, the weights of the last two layers are fine-tuned on the histopathology image dataset. The publicly available breast cancer histopathology dataset BreakHis is used to validate the proposed method. The experimental results show that transfer learning–based CNNs achieve better classification accuracy than networks trained from scratch, and that the pretrained DenseNet121 model gives the highest classification accuracy, 0.9520, compared with the other models and state-of-the-art methods.
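The transfer-learning setup described above (remove the top dense layers, append custom dense layers, and fine-tune only the last pretrained layers) can be sketched in Keras as below. This is a minimal illustration, not the authors' implementation: the input size, the custom dense-layer width, and the optimizer settings are assumptions.

```python
# Sketch of transfer learning with DenseNet121 for binary
# benign/malignant classification, as outlined in the abstract.
# Layer widths and hyperparameters are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

# Load DenseNet121 pretrained on ImageNet, with the original
# top (dense) layers removed via include_top=False.
base = tf.keras.applications.DenseNet121(
    weights="imagenet", include_top=False, input_shape=(224, 224, 3))

# Freeze all pretrained layers except the last two, which are
# left trainable so their weights can be fine-tuned on the
# histopathology images.
for layer in base.layers[:-2]:
    layer.trainable = False

# Append custom dense layers in place of the removed top.
model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(256, activation="relu"),   # assumed width
    layers.Dense(1, activation="sigmoid"),  # benign vs. malignant
])

model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
              loss="binary_crossentropy", metrics=["accuracy"])
```

The same pattern applies to the other three backbones (AlexNet, VGG16, Inception v3): swap the `base` constructor, keep the frozen-layers/fine-tuned-layers split, and retrain only the custom head plus the unfrozen layers on BreakHis.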
