Abstract

This study retrospectively analyzed the performance of artificial neural networks (ANNs) in predicting overall survival (OS) and locoregional failure (LRF) in HNSCC patients undergoing radiotherapy, based on 2-[18F]FDG PET/CT and clinical covariates. We compared predictions relying on three different sets of features extracted from 230 patients. Specifically, (i) an automated feature selection method independent of expert rating was compared with (ii) clinical variables with proven influence on OS or LRF and (iii) clinical data plus expert-selected SUV metrics. The three sets were given as input to an artificial neural network for outcome prediction, evaluated by Harrell’s concordance index (HCI) and by testing stratification capability. For OS and LRF, the best performance was achieved with expert-based PET features (0.71 HCI) and clinical variables (0.70 HCI), respectively. For OS stratification, all three feature sets were significant, whereas for LRF only expert-based PET features successfully classified low- vs. high-risk patients. Based on 2-[18F]FDG PET/CT features, stratification into risk groups using ANNs for OS and LRF is possible. Differences in the results for different feature sets confirm the relevance of feature selection and the continued importance of expert knowledge compared with automated selection.
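Harrell’s concordance index, used above to evaluate the predictions, measures how often a model assigns a higher risk score to the patient who fails earlier among all comparable pairs in right-censored survival data. The following is a minimal illustrative sketch of the metric itself (not the study’s implementation); the toy data are invented for demonstration:

```python
import numpy as np

def harrell_c_index(times, events, risk_scores):
    """Harrell's concordance index for right-censored survival data.

    A pair (i, j) is comparable when the subject with the shorter
    follow-up time actually experienced the event (events[i] == 1).
    The pair counts as concordant when that subject also received the
    higher predicted risk; ties in risk count as half-concordant.
    """
    concordant = 0.0
    comparable = 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if times[i] < times[j] and events[i]:
                comparable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1.0
                elif risk_scores[i] == risk_scores[j]:
                    concordant += 0.5
    return concordant / comparable

# toy example: higher predicted risk goes with shorter survival,
# so every comparable pair is concordant
times = np.array([2.0, 5.0, 8.0, 11.0])   # follow-up in years
events = np.array([1, 1, 0, 1])           # 1 = event observed, 0 = censored
risks = np.array([0.9, 0.6, 0.4, 0.1])    # model's predicted risk
print(harrell_c_index(times, events, risks))  # 1.0
```

A value of 0.5 corresponds to random ranking and 1.0 to perfect concordance, which puts the reported 0.70–0.71 values in context.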

Highlights

  • Advances in radiation oncology and medical imaging are closely linked, as attested by the widespread use of image guided radiotherapy (IGRT) and, more recently, the successful implementation of MR guided radiotherapy [1,2]

  • This development has the potential to translate into clinical practice by guiding treatment decisions and therapy planning, especially in an imaging-driven field such as radiation oncology [5], where for instance several studies [6,7,8] showcased how deep learning could be used for head and neck cancer outcome prediction based on pretreatment computed tomography (CT)

  • We investigated whether a neural network-based algorithm applied on PET features along with clinical data can provide prognostic information for head and neck cancer patients undergoing curative radiotherapy in terms of locoregional failure (LRF) and overall survival (OS)


Introduction

Advances in radiation oncology and medical imaging are closely linked, as attested by the widespread use of image guided radiotherapy (IGRT) and, more recently, the successful implementation of MR guided radiotherapy [1,2]. Using artificial intelligence algorithms and neural networks for machine learning allows for processing large amounts of data for predictive model building [4]. This development has the potential to translate into clinical practice by guiding treatment decisions and therapy planning, especially in an imaging-driven field such as radiation oncology [5], where for instance several studies [6,7,8] showcased how deep learning could be used for head and neck cancer outcome prediction based on pretreatment CTs. Imaging with PET/CT using 2-deoxy-2-[18F]fluoro-D-glucose (2-[18F]FDG) is routinely part of pretreatment workup in several tumor entities such as squamous cell carcinoma of the head and neck (HNSCC) and allows extraction of several features. In PET images, volumes of interest (VOIs) can be defined semi-automatically based on tracer uptake, eliminating the need to delineate VOIs manually for radiomics evaluation and allowing for high-throughput, user-independent analysis.
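The semi-automatic, uptake-based VOI definition mentioned above is commonly implemented as a threshold on the SUV image, e.g. keeping voxels above a fixed fraction of SUVmax. The sketch below illustrates that general idea only; the 40% threshold and the toy volume are illustrative assumptions, not the parameters used in this study:

```python
import numpy as np

def suv_threshold_voi(suv_volume, fraction=0.4):
    """Semi-automatic VOI: keep voxels at or above `fraction` * SUVmax.

    `fraction=0.4` (40% of SUVmax) is a commonly cited illustrative
    choice for FDG-PET segmentation, used here only as an example.
    Returns a boolean mask with the same shape as the input volume.
    """
    suv_max = suv_volume.max()
    return suv_volume >= fraction * suv_max

# toy 3D "PET" volume: a hot lesion on a low-uptake background
vol = np.zeros((4, 4, 4))
vol[1:3, 1:3, 1:3] = 10.0   # lesion voxels, SUV 10
vol[0, 0, 0] = 2.0          # background uptake, below threshold
mask = suv_threshold_voi(vol, fraction=0.4)
print(int(mask.sum()))       # 8 voxels in the VOI
print(float(vol[mask].mean()))  # SUVmean inside the VOI: 10.0
```

From such a mask, simple expert-type PET features (SUVmean, SUVmax, metabolic tumor volume as voxel count times voxel size) follow directly, without any manual delineation.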

