Abstract

Advances in artificial intelligence have made it possible to obtain more accurate and reliable results from digital images. With the progress in digital histopathology imaging enabled by whole slide image (WSI) scanners, automated analysis of digital images by computer-aided systems has attracted considerable interest. In particular, deep learning architectures are among the preferred approaches for analyzing digital histopathology images. Deep networks trained on large amounts of image data can be adapted to different tasks using the transfer learning technique. In this study, automated detection of invasive ductal carcinoma (IDC), the most common subtype of breast cancer, is proposed using deep transfer learning. We used the pre-trained deep learning models ResNet-50 and DenseNet-161 for the IDC detection task. A public histopathology dataset containing 277,524 image patches was used in our experiments. After training only the last layers of the pre-trained deep networks, the DenseNet-161 model yielded an F-score of 92.38% and a balanced accuracy of 91.57%. Similarly, we obtained an F-score of 94.11% and a balanced accuracy of 90.96% using the ResNet-50 architecture. In addition, the developed model was validated on the publicly available BreakHis breast cancer dataset and achieved promising results in classifying magnification-independent histopathology images into benign and malignant classes. Our system obtained the highest classification performance compared to state-of-the-art techniques and is ready to be tested on larger, more diverse databases.
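The abstract reports results as F-score and balanced accuracy, two metrics commonly preferred over plain accuracy for class-imbalanced binary tasks such as IDC patch detection. A minimal sketch of how these metrics are typically computed with scikit-learn follows; the label arrays below are hypothetical illustrations, not values from the paper's dataset:

```python
from sklearn.metrics import f1_score, balanced_accuracy_score

# Hypothetical ground-truth and predicted labels for a binary
# IDC-patch task (1 = IDC-positive, 0 = IDC-negative).
# These values are illustrative only, not from the paper's data.
y_true = [0, 0, 1, 1, 1, 0, 1, 0]
y_pred = [0, 1, 1, 1, 0, 0, 1, 0]

# F-score: harmonic mean of precision and recall on the positive class.
f1 = f1_score(y_true, y_pred)

# Balanced accuracy: mean of per-class recall, which avoids being
# dominated by the majority class in imbalanced WSI patch datasets.
bal_acc = balanced_accuracy_score(y_true, y_pred)

print(f"F-score: {f1:.2f}, balanced accuracy: {bal_acc:.2f}")
# → F-score: 0.75, balanced accuracy: 0.75
```

Because most patches extracted from a whole slide image are typically IDC-negative, balanced accuracy gives a fairer picture of per-class performance than raw accuracy, which is why papers in this area commonly report it alongside the F-score.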
