Abstract

Objective: To obtain molecular information directly from H&E stained slides, which primarily display morphological information, and to show that some differences at the molecular level are already encoded in morphology.

Methods: We selected Ki-67 expression as the representative of molecular information and propose a method that predicts Ki-67 positive cells directly from H&E stained slides with a deep convolutional network model. To train this model, we constructed a dataset containing Ki-67 negative cell images, Ki-67 positive cell images and background images. These images were all extracted from H&E stained whole slide images (WSIs), and the Ki-67 expression labels were acquired from the corresponding immunohistochemistry (IHC) stained WSIs. The trained model was evaluated both on classification performance and on its ability to quantify Ki-67 expression in H&E stained images.

Results: The model achieved an average accuracy of 0.9371 in discriminating Ki-67 negative cell images, Ki-67 positive cell images and background images. For quantification, the correlation coefficient between the Ki-67 expression predicted by our model from H&E stained images and that obtained from IHC stained images by color channel filtering was 0.80.

Conclusion and Significance: Our study indicates that the deep learning model performs well both in predicting Ki-67 positive cells and in quantifying Ki-67 expression in cancer samples stained by H&E. More generally, this study shows that deep learning is a powerful tool for exploring the relationship between morphological and molecular information.

Availability and Implementation: The main program is available at https://github.com/liuyiqing2018/predict_Ki-67_from_HE
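To make the classification setup concrete, the following is a minimal sketch of a three-class cell-patch classifier of the kind described above. The backbone (an ImageNet-pretrained ResNet-18), the patch size, and the class encoding are assumptions for illustration only; the released code at the repository above defines the authors' actual architecture and training procedure.

```python
# Minimal sketch (assumed architecture, NOT the authors' released code):
# a CNN fine-tuned to classify H&E cell patches into background,
# Ki-67 negative cell, or Ki-67 positive cell, using labels derived
# from the matched IHC slides.
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 3  # 0: background, 1: Ki-67 negative cell, 2: Ki-67 positive cell


def build_model() -> nn.Module:
    # Start from an ImageNet-pretrained backbone and replace the head
    # with a three-way classifier.
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)
    return model


def train_step(model, batch, optimizer, device="cuda"):
    # One supervised step on a batch of H&E patches with IHC-derived labels.
    images, labels = batch  # images: (B, 3, H, W), labels: (B,)
    images, labels = images.to(device), labels.to(device)
    optimizer.zero_grad()
    logits = model(images)
    loss = nn.functional.cross_entropy(logits, labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```

The three-class formulation (rather than a binary positive/negative classifier) lets the same model reject background patches during whole-slide inference, which is needed before any per-cell counting.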

Highlights

  • In recent years, deep learning has developed rapidly and has outperformed humans in some medical data analysis tasks (Li et al, 2018; Norgeot et al, 2019; von Chamier et al, 2019)

  • We constructed a dataset containing Ki-67 negative or positive cell images and background images. These images were all extracted from hematoxylin and eosin (H&E) stained whole slide images (WSIs) and the Ki-67 expression was acquired from the corresponding IHC stained WSIs

  • In the quantification evaluation, the correlation coefficient between the Ki-67 expression predicted by our model from H&E stained images and that obtained from IHC stained images by color channel filtering was 0.80 (a worked sketch of this comparison follows these highlights)
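The sketch below illustrates how such a comparison could be computed: a per-image Ki-67 labelling index is aggregated from per-cell predictions and correlated (Pearson r) against IHC-derived reference values. The aggregation rule, class encoding, and function names are assumptions for illustration, not the paper's pipeline.

```python
# Illustrative only: derive a Ki-67 index from per-cell predictions and
# compare it against IHC-derived reference indices with Pearson correlation.
import numpy as np
from scipy.stats import pearsonr


def ki67_index(cell_classes: np.ndarray) -> float:
    # cell_classes: predicted class per detected cell in one image
    # (1 = Ki-67 negative, 2 = Ki-67 positive; background excluded).
    positive = np.sum(cell_classes == 2)
    negative = np.sum(cell_classes == 1)
    return positive / max(positive + negative, 1)


def correlate_indices(he_cell_classes_per_image, ihc_indices):
    # he_cell_classes_per_image: list of per-cell prediction arrays, one per image.
    # ihc_indices: matched reference indices, e.g. from DAB color channel filtering.
    he_indices = np.array([ki67_index(c) for c in he_cell_classes_per_image])
    r, p_value = pearsonr(he_indices, np.asarray(ihc_indices))
    return r, p_value
```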


Introduction

Deep learning has developed rapidly in recent years and has outperformed humans in some medical data analysis tasks (Li et al, 2018; Norgeot et al, 2019; von Chamier et al, 2019). It is therefore natural to apply deep learning algorithms to whole slide images (WSIs). Many studies have explored the potential of deep learning for histopathological image analysis (Komura and Ishikawa, 2018), such as detection or segmentation of regions of interest (ROIs) (Spanhol et al, 2016), scoring of immunostaining (Mungle et al, 2017) and mitosis detection (Roux et al, 2013). Pathologists rely on H&E staining for diagnosis, and the majority of algorithms for histopathological image analysis, such as cell detection, tissue segmentation and cancer grading, are based on H&E images (Ghaznavi et al, 2013). Molecular information, such as the expression of an antigen (protein) in cells, operates at a finer scale and is not reflected in H&E stained slides, which makes it difficult for both pathologists and algorithms to analyze and assess.

