Abstract
Background and Objective
High-resolution histopathology whole slide images (WSIs) contain abundant information that is valuable for cancer prognosis. However, most computational pathology methods for survival prediction have weak interpretability and cannot reasonably explain their decision-making processes. To address this issue, we propose a highly interpretable neural network, termed the pattern-perceptive survival transformer (Surformer), for cancer survival prediction from WSIs.

Methods
Notably, Surformer can quantify specific histological patterns using only bag-level labels, without any patch- or cell-level auxiliary information. Specifically, the proposed ratio-reserved cross-attention module (RRCA) generates global and local features using learnable prototypes (p_global, p_locals) as detectors and quantifies the patches correlated with each p_local in the form of ratio factors (rfs). Multi-head self- and cross-attention modules then refine these features to enhance robustness against noise. Finally, the designed disentangling loss guides the multiple local features to focus on distinct patterns, thereby helping the rfs from the RRCA achieve more explicit quantification of histological features.

Results
Extensive experiments on five TCGA datasets show that Surformer outperforms existing state-of-the-art methods. In addition, we demonstrate its interpretability by visualizing the rfs distributions across high-risk and low-risk cohorts and by retrieving and analyzing the critical histological patterns contributing to survival prediction.

Conclusions
Surformer is expected to serve as a useful tool for histopathology image data-driven analysis and for gaining new insights into the associations between such images and patient survival states.
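As a rough illustration of the prototype-based quantification described in the Methods, the sketch below shows, in PyTorch, how a learnable global prototype and several learnable local prototypes could cross-attend over a bag of patch embeddings and yield per-prototype ratio factors. This is a minimal sketch under stated assumptions: the class name, the argmax-based patch assignment used to form the ratio factors, and all dimensions are illustrative choices, not the authors' implementation of RRCA.

```python
# Minimal sketch of prototype-based cross-attention with ratio factors.
# Assumes PyTorch; RRCASketch, n_local, and the ratio computation are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F


class RRCASketch(nn.Module):
    def __init__(self, dim: int = 256, n_local: int = 4):
        super().__init__()
        # One global prototype and several local prototypes act as learnable queries.
        self.p_global = nn.Parameter(torch.randn(1, dim))
        self.p_locals = nn.Parameter(torch.randn(n_local, dim))
        self.to_k = nn.Linear(dim, dim)
        self.to_v = nn.Linear(dim, dim)
        self.scale = dim ** -0.5

    def forward(self, patches: torch.Tensor):
        # patches: (n_patches, dim) embeddings of one WSI bag.
        q = torch.cat([self.p_global, self.p_locals], dim=0)   # (1 + n_local, dim)
        k, v = self.to_k(patches), self.to_v(patches)
        attn = F.softmax(q @ k.t() * self.scale, dim=-1)        # (1 + n_local, n_patches)
        feats = attn @ v                                         # prototype-conditioned features
        g_feat, l_feats = feats[:1], feats[1:]

        # Illustrative ratio factors: the fraction of patches most attended by each
        # local prototype, using an argmax "assignment" as a simple proxy.
        assign = attn[1:].argmax(dim=0)                          # (n_patches,)
        rfs = torch.stack(
            [(assign == i).float().mean() for i in range(self.p_locals.shape[0])]
        )
        return g_feat, l_feats, rfs


if __name__ == "__main__":
    bag = torch.randn(500, 256)          # a bag of 500 patch embeddings
    g, l, rfs = RRCASketch()(bag)
    print(g.shape, l.shape, rfs)         # (1, 256), (4, 256), 4 ratios summing to 1
```

In this reading, the ratio factors summarize how much of the slide each local prototype "claims", which is what allows them to be compared across high-risk and low-risk cohorts; the disentangling loss mentioned in the Methods would additionally push the local features toward distinct patterns, but its exact form is not specified in the abstract and is therefore not sketched here.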