Abstract

Additional prognostic stratification of colorectal cancer patients is needed to improve patient management. Visual microscopic assessment of tumor samples remains the standard method for disease subtyping. However, visual analysis of samples is subjective and poorly reproducible due to inter- and intra-observer variation. Recent progress in machine learning, in particular its branch known as deep learning, enables accurate evaluation of the complex patterns observed in microscopic tissue images. Here, we applied deep learning techniques to evaluate a set of digitized formalin-fixed, paraffin-embedded, hematoxylin-eosin-stained tumor tissue microarray (TMA) samples from 420 randomly selected patients with colorectal cancer. For each patient, a set of clinicopathological characteristics, including histological grade, Dukes stage and age at diagnosis, was available, as well as outcome data. Using convolutional neural networks and Long Short-Term Memory (LSTM) networks, we validated the predictive power of the colorectal TMAs with regard to patient outcome. Univariate Cox proportional hazards regression analysis demonstrated that the prognostic accuracy of the deep learning algorithm on TMAs (hazard ratio 2.3; 95% CI 1.79-3.03) outperforms visual histological grading performed by a certified pathologist at the whole-slide level (hazard ratio 1.65; 95% CI 1.30-2.15). In multivariate Cox proportional hazards regression, the deep learning-based model was a prognostic factor independent of histological grade, Dukes stage and age at diagnosis (Wald p-value < 0.001). Thus, we demonstrate that novel deep learning models can serve as digital prognostic biomarkers in colorectal cancer.

Citation Format: Dmitrii Bychkov, Riku Turkki, Caj Haglund, Nina Linder, Johan Lundin. Outcome prediction in colorectal cancer using digitized tumor samples and machine learning [abstract]. In: Proceedings of the American Association for Cancer Research Annual Meeting 2017; 2017 Apr 1-5; Washington, DC. Philadelphia (PA): AACR; Cancer Res 2017;77(13 Suppl):Abstract nr 5718. doi:10.1158/1538-7445.AM2017-5718
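The abstract does not include code. The sketches below illustrate, under stated assumptions, how the two described components could be set up: a CNN-plus-LSTM risk model over image tiles from each TMA spot, and the univariate and multivariate Cox proportional hazards analyses. All identifiers here (tile counts, file names, column names, layer sizes) are hypothetical and are not taken from the study; this is not the authors' implementation.

```python
# Minimal sketch (assumed, not the study's code) of a per-patient risk model:
# a pre-trained CNN extracts features from each tile of a TMA spot, and an
# LSTM aggregates the tile sequence into a single risk score.
from tensorflow.keras import layers, models, applications

TILES_PER_SPOT = 16   # hypothetical number of tiles cut from each TMA spot
TILE_SIZE = 224       # hypothetical tile width/height in pixels

# Pre-trained CNN used as a fixed per-tile feature extractor.
cnn = applications.VGG16(include_top=False, pooling="avg",
                         input_shape=(TILE_SIZE, TILE_SIZE, 3))
cnn.trainable = False

inputs = layers.Input(shape=(TILES_PER_SPOT, TILE_SIZE, TILE_SIZE, 3))
features = layers.TimeDistributed(cnn)(inputs)        # (batch, tiles, 512)
state = layers.LSTM(64)(features)                     # aggregate tile sequence
risk = layers.Dense(1, activation="sigmoid")(state)   # predicted outcome risk

model = models.Model(inputs, risk)
model.compile(optimizer="adam", loss="binary_crossentropy")
model.summary()
```

The survival analyses reported in the abstract could then, for example, be reproduced with a standard Cox proportional hazards fit over the predicted risk score and the clinicopathological covariates, as in this sketch using the lifelines library (column names and the input file are assumptions):

```python
# Minimal sketch (assumed) of univariate and multivariate Cox proportional
# hazards models; hazard ratios are read from the exp(coef) column and
# per-covariate Wald test p-values from the model summary.
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical file with one row per patient; covariates numerically encoded.
df = pd.read_csv("tma_cohort.csv")

# Univariate model: deep learning risk score only.
cph_uni = CoxPHFitter()
cph_uni.fit(df[["followup_months", "event", "dl_risk_score"]],
            duration_col="followup_months", event_col="event")
cph_uni.print_summary()

# Multivariate model: adjust for histological grade, Dukes stage and age.
cph_multi = CoxPHFitter()
cph_multi.fit(df[["followup_months", "event", "dl_risk_score",
                  "histological_grade", "dukes_stage", "age_at_diagnosis"]],
              duration_col="followup_months", event_col="event")
cph_multi.print_summary()
```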
