Abstract

The surgical pathology workflow currently adopted in clinics uses staining to reveal tissue architecture within thin sections. A trained pathologist then visually examines these sections and, because the investigation rests on an empirical assessment, a certain amount of subjectivity is unavoidable. Furthermore, the reliance on external contrast agents such as hematoxylin and eosin (H&E), although well established, makes it difficult to standardize color balance, staining strength, and imaging conditions, which hinders automated computational analysis. In response to these challenges, we applied spatial light interference microscopy (SLIM), a label-free method that generates contrast from intrinsic tissue refractive index signatures, thereby reducing human bias and making imaging data comparable across instruments and clinics. We applied a Mask R-CNN deep learning algorithm to the SLIM data to achieve an automated colorectal cancer screening procedure, i.e., classifying normal vs. cancerous specimens. On a tissue microarray consisting of specimens from 132 patients, our method achieved 91% accuracy in gland detection, 99.71% accuracy in gland-level classification, and 97% accuracy in core-level classification. A SLIM tissue scanner accompanied by an application-specific deep learning algorithm may become a valuable clinical tool, enabling faster and more accurate assessments by pathologists.
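
As a rough illustration of the detection stage, the sketch below runs a Mask R-CNN over a single SLIM phase map using torchvision. This is a minimal sketch under stated assumptions, not the trained model reported above: the three-class setup (background, benign gland, cancerous gland), the replication of the single phase channel to three channels, and the 0.7 score cutoff are illustrative choices.

    # Hypothetical sketch: gland detection on a SLIM phase map with a Mask R-CNN.
    # The class layout, preprocessing, and threshold are assumptions for illustration.
    import torch
    import torchvision

    # Assumed output classes: 0 = background, 1 = benign gland, 2 = cancerous gland.
    model = torchvision.models.detection.maskrcnn_resnet50_fpn(num_classes=3)
    model.eval()

    # A SLIM phase image is a single-channel optical path-length map; the detector
    # expects 3 channels, so the channel is repeated (an assumed preprocessing step).
    phase = torch.rand(1, 512, 512)        # stand-in for a normalized phase map
    image = phase.repeat(3, 1, 1)

    with torch.no_grad():
        out = model([image])[0]            # boxes, labels, scores, masks per detection

    keep = out["scores"] > 0.7             # confidence cutoff, chosen arbitrarily here
    glands = {k: v[keep] for k, v in out.items()}
    print(f"{int(keep.sum())} glands detected above the score threshold")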

Highlights

  • From benign adenomatous polyps to carcinoma, colorectal cancer develops through a series of genetic mutations over a 5–10 year course [1]

  • Artificial intelligence (AI) can recognize image patterns that may be too subtle for human eyes and can significantly improve clinical cancer screening and diagnosis

  • We have recently demonstrated that the combination of spatial light interference microscopy (SLIM) and AI can screen colorectal tissue as cancerous vs. benign [28]

Introduction

From benign adenomatous polyps to carcinoma, colorectal cancer develops through a series of genetic mutations over a 5–10 year course [1]. AI can recognize image patterns that may be too subtle for human eyes and can significantly improve clinical cancer screening and diagnosis. In this area, we have recently demonstrated that the combination of SLIM and AI can screen colorectal tissue as cancerous vs. benign [28]. To advance our screening method and provide fully automatic classification, here we used a SLIM-based whole-slide-imaging tissue scanner together with a Mask R-CNN deep learning network to both segment the glands and classify cancerous and benign tissue, eliminating the need for manual segmentation as a prerequisite of the procedure. This new method of automatic colorectal cancer screening uses intrinsic tissue markers and is of potential clinical value.
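
To make the gland-to-core rollup concrete, the sketch below aggregates gland-level predictions into a single core-level call. The aggregation rule (flag a core as cancerous when the fraction of confidently detected cancerous glands exceeds a cutoff) and all parameter values are assumptions for illustration, not the decision rule used in the paper.

    # Hypothetical sketch: rolling gland-level predictions up to a core-level call.
    def classify_core(gland_labels, gland_scores, min_score=0.7, cancer_fraction=0.1):
        """gland_labels: 1 = benign, 2 = cancerous; gland_scores: detector confidences."""
        confident = [(l, s) for l, s in zip(gland_labels, gland_scores) if s >= min_score]
        if not confident:
            return "indeterminate"
        cancerous = sum(1 for l, _ in confident if l == 2)
        return "cancerous" if cancerous / len(confident) >= cancer_fraction else "benign"

    # Example: a core with 12 confidently detected glands, 3 of them flagged cancerous.
    print(classify_core([2, 1, 1, 2, 1, 1, 1, 2, 1, 1, 1, 1], [0.9] * 12))  # -> cancerous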
