Abstract

Severe radiation-induced lymphopenia (RIL) in patients undergoing chemoradiotherapy (CRT) for non-small cell lung cancer (NSCLC) is associated with decreased immunotherapy efficacy and survival. Prediction models for lymphopenia were previously developed at The Christie in lung cancer patients and at MD Anderson Cancer Center (MDACC) in esophageal cancer patients. The aim of this study was to externally validate both models in patients with stage III NSCLC. Patients who underwent concurrent CRT for stage III NSCLC between 2019 and 2021 were studied. Outcomes were grade ≥3 and grade 4 lymphopenia during CRT. The Christie model predictors for grade ≥3 lymphopenia were age, baseline lymphocyte count, radiotherapy duration, chemotherapy, mean heart and lung doses, and thoracic vertebrae V20Gy; the MDACC predictors for grade 4 lymphopenia were age, baseline lymphocyte count, planning target volume (PTV), and BMI. The external performance of both models was assessed in terms of discrimination and calibration. Among 100 patients, 78 (78%) developed grade ≥3 lymphopenia and 17 (17%) developed grade 4 lymphopenia. For predicting grade ≥3 lymphopenia, the Christie and MDACC models yielded c-statistics of 0.77 and 0.79, respectively; for predicting grade 4 lymphopenia, the c-statistics were 0.69 and 0.80, respectively. Calibration showed moderate agreement for the Christie model and good agreement for the MDACC model. The PTV-based MDACC prediction model for severe RIL therefore demonstrated superior external performance in NSCLC patients compared with the dosimetry-based Christie model, and it can aid in identifying patients at high risk of severe lymphopenia. To optimize radiotherapy planning, however, further improvement and external validation of dosimetry-based models is desirable.
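To illustrate the kind of external validation reported here (discrimination via the c-statistic and calibration of predicted risks), the following is a minimal sketch, not the authors' code: the intercept, coefficients, variable selection, and patient data are hypothetical placeholders, not the actual Christie or MDACC model parameters.

```python
# Minimal sketch of externally validating a published logistic prediction model.
# All coefficients, variable names, and patient data are hypothetical placeholders;
# they are NOT the actual Christie or MDACC model parameters.
import numpy as np
from sklearn.metrics import roc_auc_score
from sklearn.linear_model import LogisticRegression

# Hypothetical published model: intercept and coefficients for
# age (years), baseline lymphocyte count (10^9/L), and PTV (cm^3).
INTERCEPT = -2.0
COEFS = np.array([0.03, -1.2, 0.002])  # placeholder values

def predict_risk(X):
    """Predicted probability of severe lymphopenia from the hypothetical model."""
    lp = INTERCEPT + X @ COEFS          # linear predictor
    return 1.0 / (1.0 + np.exp(-lp))    # logistic link

# Hypothetical external validation cohort: columns = [age, baseline ALC, PTV].
rng = np.random.default_rng(0)
X_val = np.column_stack([
    rng.normal(65, 8, 100),      # age (years)
    rng.normal(1.8, 0.5, 100),   # baseline lymphocyte count (10^9/L)
    rng.normal(450, 120, 100),   # PTV (cm^3)
])
y_val = rng.binomial(1, 0.2, 100)  # observed severe lymphopenia (0/1), simulated

p = predict_risk(X_val)

# Discrimination: the c-statistic equals the area under the ROC curve.
c_statistic = roc_auc_score(y_val, p)

# Calibration: slope and intercept of a logistic recalibration model fitted on
# the logit of the predicted risk (large C effectively disables regularization).
logit_p = np.log(p / (1 - p)).reshape(-1, 1)
recal = LogisticRegression(C=1e9).fit(logit_p, y_val)
calibration_slope = recal.coef_[0][0]
calibration_intercept = recal.intercept_[0]

print(f"c-statistic: {c_statistic:.2f}")
print(f"calibration slope: {calibration_slope:.2f}, intercept: {calibration_intercept:.2f}")
```

In this framing, a c-statistic near 0.8 would indicate good discrimination, while a calibration slope near 1 and intercept near 0 would indicate good agreement between predicted and observed risks.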
