Abstract

Multiplexed immunofluorescence (mIF) microscopy reveals the spatial architecture of cancer tissue and its microenvironment, which existing analysis methods do not fully explore. Most analysis approaches for mIF microscopy data focus on single-cell or region classification and rely on supervised machine learning. In this work, we present a self-supervised spatial profiling method for mIF images that takes local and global associations into account, and apply it to profile cancer-associated fibroblasts (CAFs) in a pan-cancer dataset. We studied a two-stage self-supervised training scheme that learns representations of mIF tissue microarray (TMA) images at local and global scales (cellular and long-range associations). In the first self-supervised training stage, a Vision Transformer learns local-scale representations from small patches of TMA images. The local representations are then used as input to the second stage, where a similar self-supervised learning strategy learns global patterns in the TMA images. We applied the method to profile multiple cohorts from three solid tumor types: prostate, renal, and lung cancer. In total, these cohorts include more than 5,000 TMA cores from over 1,750 patients, sampled from the tumor center, tumor edge, and adjacent benign areas. The samples were stained with a CAF panel including FAP, aSMA, PDGFRB, pSTAT3/PDGFRA, and nuclear and epithelial markers, and imaged with cyclic mIF microscopy. Samples were studied at the patch level and the TMA core level. Small patches enable further analysis of associations in the local environment, whereas the core level enables associations with patient clinical information. Clustering of patch-level (first-stage) and core-level (second-stage) representations independently showed that self-supervised learning is capable of learning representations of the mIF images. We were able to identify regions and cases with high pTNM staging among the prostate cancer samples and similar histological subtypes among the renal cancer samples. We further validated the clustering with k-NN classification, which showed high classification accuracy in all cohorts. Moreover, we developed a stopping criterion for final model selection that balances the similarity of samples against the similarity of patches within samples to prevent overfitting. Our study shows that self-supervised learning enables unbiased discoveries from large-scale mIF microscopy imaging datasets. The developed method uncovers associations between imaging data and clinical information and directly highlights the patterns that are most meaningful for these associations.

Citation Format: Gantugs Atarsaikhan, Isabel Mogollon, Katja Välimäki, Tuomas Mirtti, Teijo Pellinen, Lassi Paavolainen. Pan-cancer tumor microenvironment profiling with multiplexed immunofluorescence microscopy and self-supervised learning [abstract]. In: Proceedings of the American Association for Cancer Research Annual Meeting 2024; Part 1 (Regular Abstracts); 2024 Apr 5-10; San Diego, CA. Philadelphia (PA): AACR; Cancer Res 2024;84(6_Suppl):Abstract nr 892.
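
The two-stage scheme described above can be outlined in code. The following PyTorch sketch is illustrative only: the contrastive (NT-Xent) objective, the seven input channels, the embedding sizes, and the names nt_xent_loss, PatchViT, and CoreTransformer are assumptions for the sketch, not the authors' implementation. Stage 1 trains a small Vision Transformer on multi-channel mIF patches; stage 2 runs a similar transformer over the frozen patch embeddings of each TMA core to obtain core-level representations, mirroring the use of local representations as input to the second stage.

# Minimal sketch of a two-stage self-supervised scheme for mIF TMA images.
# All hyperparameters and the SimCLR-style loss are assumptions, not the
# exact method from the abstract.
import torch
import torch.nn as nn
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.1):
    """Contrastive loss between two augmented views (assumed SSL objective)."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    z = torch.cat([z1, z2], dim=0)                      # (2N, D)
    sim = z @ z.t() / temperature                       # pairwise similarities
    n = z1.size(0)
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim.masked_fill_(mask, float("-inf"))               # drop self-similarity
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)

class PatchViT(nn.Module):
    """Stage 1: ViT-style encoder for small multi-channel mIF patches."""
    def __init__(self, in_chans=7, patch=16, img=64, dim=256, depth=6, heads=8):
        super().__init__()
        self.tokenize = nn.Conv2d(in_chans, dim, kernel_size=patch, stride=patch)
        n_tokens = (img // patch) ** 2
        self.cls = nn.Parameter(torch.zeros(1, 1, dim))
        self.pos = nn.Parameter(torch.zeros(1, n_tokens + 1, dim))
        layer = nn.TransformerEncoderLayer(dim, heads, dim * 4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, depth)

    def forward(self, x):                                # x: (B, C, H, W)
        t = self.tokenize(x).flatten(2).transpose(1, 2)  # (B, T, D)
        t = torch.cat([self.cls.expand(len(t), -1, -1), t], dim=1) + self.pos
        return self.encoder(t)[:, 0]                     # CLS embedding per patch

class CoreTransformer(nn.Module):
    """Stage 2: transformer over frozen patch embeddings of one TMA core."""
    def __init__(self, dim=256, depth=4, heads=8):
        super().__init__()
        self.cls = nn.Parameter(torch.zeros(1, 1, dim))
        layer = nn.TransformerEncoderLayer(dim, heads, dim * 4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, depth)

    def forward(self, patch_emb):                        # (B, n_patches, D)
        t = torch.cat([self.cls.expand(len(patch_emb), -1, -1), patch_emb], dim=1)
        return self.encoder(t)[:, 0]                     # core-level embedding

# Usage sketch: two augmented views per patch in stage 1, two views built from
# the same cores' patch embeddings in stage 2; both stages minimize the loss.
patches_v1 = torch.randn(8, 7, 64, 64)                   # toy mIF patches, view 1
patches_v2 = torch.randn(8, 7, 64, 64)                   # toy mIF patches, view 2
stage1 = PatchViT()
loss1 = nt_xent_loss(stage1(patches_v1), stage1(patches_v2))

with torch.no_grad():                                     # freeze stage-1 encoder
    emb_v1 = stage1(patches_v1).reshape(2, 4, -1)         # 2 cores x 4 patches
    emb_v2 = stage1(patches_v2).reshape(2, 4, -1)
stage2 = CoreTransformer()
loss2 = nt_xent_loss(stage2(emb_v1), stage2(emb_v2))

The core-level embeddings produced by the second stage could then be clustered or fed to a k-NN classifier, as in the validation described in the abstract; those downstream steps are standard and omitted here.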
