Abstract

For over 100 years, the traditional tools of pathology, such as tissue-marking dyes (e.g., the H&E stain), have been used to study the disorganization and dysfunction of cells within tissues. This has represented a principal diagnostic and prognostic tool in cancer. However, in the last 5 years, new technologies have promised to revolutionize histopathology, with Spatial Transcriptomics technologies allowing us to measure gene expression directly in pathology-stained tissue sections. In parallel with these developments, Artificial Intelligence (AI) applied to histopathology tissue images now approaches pathologist-level performance in cell type identification. However, these new technologies still have severe limitations: Spatial Transcriptomics struggles to distinguish transcriptionally similar cell types, and AI-based pathology tools often perform poorly on real-world out-of-batch test datasets. Thus, century-old techniques still represent the standard of care in most areas of clinical cancer diagnostics and prognostics. Here, we present a new frontier in digital pathology: a conceptually novel computational methodology, based on Bayesian probabilistic modelling, that allows Spatial Transcriptomics data to be leveraged together with the output of deep learning-based AI used to computationally annotate H&E-stained sections of the same tumor. By leveraging cell-type annotations from multiple independent pathologists, we show that this integrated methodology achieves better performance than any given pathologist’s manual tissue annotation in the task of identifying regions of immune cell infiltration in breast cancer, and easily outperforms either technology alone. We also show that, on a subset of the histopathology slides examined, the methodology can identify regions of clinically relevant immune cell infiltration that were missed entirely by an initial pathologist’s manual annotation. While this use case has clear diagnostic and prognostic value in cancer (e.g., predicting response to immunotherapy), our methodology is generalizable to any type of pathology image and also has broad applications in spatial transcriptomics data analytics, where most applications (such as identifying cell-cell interactions) rely on correct cell type annotations having been established a priori. We anticipate that this work will spur many follow-up studies, including new computational innovations building on the approach. The work sets the stage for better-than-pathologist performance in other cell-type annotation tasks, with relevant applications in diagnostics and prognostics across almost all cancers.

Citation Format: Asif Zubair, Rich Chapple, Sivaraman Natarajan, William C. Wright, Min Pan, Hyeong-Min Lee, Heather Tillman, John Easton, Paul Geeleher. Jointly leveraging spatial transcriptomics and deep learning models for image annotation achieves better-than-pathologist performance in cell type identification in tumors [abstract]. In: Proceedings of the American Association for Cancer Research Annual Meeting 2022; 2022 Apr 8-13. Philadelphia (PA): AACR; Cancer Res 2022;82(12_Suppl):Abstract nr 456.
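The abstract does not spell out the underlying model, but the core idea it describes, combining probabilistic evidence from Spatial Transcriptomics with cell-type calls from a deep learning model applied to the matched H&E image, can be illustrated with a minimal sketch. The snippet below is a hypothetical example, not the authors' implementation: it fuses two per-spot cell-type probability vectors under a simple conditional-independence assumption, and the names `fuse_cell_type_probabilities`, `p_spatial`, and `p_image` are introduced here purely for illustration.

```python
# Illustrative sketch only: a minimal Bayesian "product-of-experts" style fusion
# of two per-spot cell-type probability estimates, one from spatial
# transcriptomics deconvolution and one from a deep-learning H&E image model.
# This is an assumption about how such evidence could be combined, not the
# method described in the abstract.
import numpy as np

def fuse_cell_type_probabilities(p_spatial, p_image, prior=None, eps=1e-12):
    """Combine two independent probability estimates for each spot.

    p_spatial : (n_spots, n_cell_types) probabilities from spatial transcriptomics
    p_image   : (n_spots, n_cell_types) probabilities from the H&E image model
    prior     : optional (n_cell_types,) prior over cell types (uniform if None)
    Returns a (n_spots, n_cell_types) posterior, treating the two modalities as
    conditionally independent given the true cell type.
    """
    p_spatial = np.asarray(p_spatial, dtype=float)
    p_image = np.asarray(p_image, dtype=float)
    n_types = p_spatial.shape[1]
    if prior is None:
        prior = np.full(n_types, 1.0 / n_types)
    # Unnormalized posterior: likelihood_spatial * likelihood_image * prior
    unnorm = p_spatial * p_image * prior
    # Normalize each spot's probabilities so they sum to 1
    return unnorm / (unnorm.sum(axis=1, keepdims=True) + eps)

# Example: two spots, three candidate cell types (e.g. tumor, immune, stroma)
p_st = np.array([[0.5, 0.4, 0.1],
                 [0.2, 0.7, 0.1]])
p_he = np.array([[0.3, 0.6, 0.1],
                 [0.1, 0.8, 0.1]])
posterior = fuse_cell_type_probabilities(p_st, p_he)
print(posterior.round(3))  # agreement between modalities sharpens each call
```

In this toy setting, a cell type that both modalities consider likely dominates the posterior, while a call supported by only one modality is down-weighted, which mirrors the intuition behind integrating transcriptomic and image-based evidence.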
