Abstract

Prostate cancer harbors several genetic alterations, the most prevalent of which is the *TMPRSS2:ERG* gene fusion, affecting nearly half of all cases. Capitalizing on the increasing availability of whole-slide images (WSI), this study introduces a deep learning (DL) model designed to detect *TMPRSS2:ERG* fusion from H&E-stained WSIs of radical prostatectomy specimens. Leveraging the TCGA prostate adenocarcinoma cohort, which comprises 436 WSIs from 393 patients, we developed a robust DL model trained across 10 different splits, each consisting of distinct training, validation, and testing sets. At its best, the model achieved an AUC of 0.84 during training and 0.72 on the TCGA test set. The model was subsequently validated on an independent cohort of 314 WSIs from a different institution, on which it robustly predicted *TMPRSS2:ERG* fusion with an AUC of 0.73. Importantly, the model identifies highly attended tissue regions associated with *TMPRSS2:ERG* fusion, characterized by higher neoplastic cell content and altered immune and stromal profiles compared with fusion-negative cases. Multivariate survival analysis revealed that these morphologic features correlate with poorer survival outcomes, independent of Gleason grade and tumor stage. This study underscores the potential of DL in deducing genetic alterations from routine slides and identifying their underlying morphologic features, which might harbor prognostic information.

Implications:

Our study illuminates the potential of deep learning in effectively inferring key prostate cancer genetic alterations from the tissue morphology depicted in routinely available histology slides, offering a cost-effective method that could revolutionize diagnostic strategies in oncology.
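The abstract's mention of "highly attended tissue regions" suggests an attention-based multiple-instance learning (MIL) approach, in which a slide is represented as a bag of tile embeddings and the model learns per-tile attention weights. The sketch below is a minimal, hypothetical illustration of that general mechanism in PyTorch; the class name, dimensions, and encoder are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch of attention-based MIL over WSI tile embeddings
# (illustrative assumption, not the paper's actual architecture).
import torch
import torch.nn as nn

class AttentionMIL(nn.Module):
    def __init__(self, embed_dim: int = 1024, hidden_dim: int = 256):
        super().__init__()
        # Two-layer attention network scores each tile embedding.
        self.attn = nn.Sequential(
            nn.Linear(embed_dim, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, 1),
        )
        # Slide-level head: logit for TMPRSS2:ERG fusion status.
        self.classifier = nn.Linear(embed_dim, 1)

    def forward(self, tiles: torch.Tensor):
        # tiles: (num_tiles, embed_dim) features from one slide,
        # typically produced by a pretrained tile encoder (assumption).
        scores = self.attn(tiles)                   # (num_tiles, 1)
        weights = torch.softmax(scores, dim=0)      # attention over tiles
        slide_embedding = (weights * tiles).sum(0)  # weighted average
        logit = self.classifier(slide_embedding)    # fusion-status logit
        return logit, weights

# Usage with dummy features for one hypothetical slide of 500 tiles:
model = AttentionMIL()
features = torch.randn(500, 1024)
logit, attn = model(features)
top_tiles = attn.squeeze(-1).topk(5).indices  # most-attended tile indices
```

Returning the attention weights alongside the slide-level prediction is what allows the highly attended regions to be mapped back onto the slide for the kind of morphologic interpretation the abstract describes.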
