Abstract

Background and purpose
In online adaptive magnetic resonance imaging (MRI)-guided radiotherapy (MRIgRT), manual contouring of rectal tumors on daily images is labor-intensive and time-consuming. Automating this task is complex because tumor shape and location vary substantially between patients. The aim of this work was to investigate different approaches for propagating patient-specific prior information to the online adaptive treatment fractions to improve deep-learning-based auto-segmentation of rectal tumors.

Materials and methods
243 T2-weighted MRI scans of 49 rectal cancer patients treated on the 1.5 T MR-linear accelerator (MR-Linac) were used to train models to segment rectal tumors. As a benchmark, an MRI-only auto-segmentation model (MRI_only) was trained. Three approaches for including a patient-specific prior were studied: (1) adding the segmentation of fraction 1 as an extra input channel for the auto-segmentation of subsequent fractions (MRI+prior), (2) fine-tuning the MRI_only model on fraction 1 (PSF_1), and (3) fine-tuning the MRI_only model on all earlier fractions (PSF_cumulative). Auto-segmentations were compared to the manual segmentations using geometric similarity metrics. Clinical impact was assessed by evaluating post-treatment target coverage.

Results
All patient-specific methods outperformed the MRI_only segmentation approach. Median 95th percentile Hausdorff distances (95HD) were 22.0 mm (range: 6.1–76.6 mm) for MRI_only segmentation, 9.9 mm (range: 2.5–38.2 mm) for MRI+prior segmentation, 6.4 mm (range: 2.4–17.8 mm) for PSF_1, and 4.8 mm (range: 1.7–26.9 mm) for PSF_cumulative. PSF_cumulative was superior to PSF_1 from fraction 4 onward (p = 0.014).

Conclusion
Patient-specific fine-tuning of rectal tumor auto-segmentation models, using the images and segmentations from all previous fractions, yields superior segmentation quality compared to the other auto-segmentation approaches.
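
The following is a minimal sketch of the PSF_cumulative idea described above: before segmenting fraction k, the population (MRI_only) model is fine-tuned on the images and manual segmentations of all earlier fractions of the same patient. The base model, data layout, and hyperparameters are illustrative assumptions, not the authors' implementation.

```python
# Patient-specific fine-tuning sketch (PSF_cumulative), assuming a generic
# PyTorch segmentation network and single-channel binary (tumor/background)
# label volumes stored as float tensors. Illustrative only.
import copy
import torch
from torch import nn, optim

def finetune_for_fraction(base_model: nn.Module,
                          prior_images: list[torch.Tensor],
                          prior_labels: list[torch.Tensor],
                          epochs: int = 50,
                          lr: float = 1e-4) -> nn.Module:
    """Return a patient-specific copy of base_model, fine-tuned on all
    previously delivered fractions (1..k-1) of this patient."""
    model = copy.deepcopy(base_model)      # keep the population model intact
    model.train()
    opt = optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.BCEWithLogitsLoss()       # binary tumor vs. background
    for _ in range(epochs):
        for img, lbl in zip(prior_images, prior_labels):
            opt.zero_grad()
            pred = model(img.unsqueeze(0))             # add batch dimension
            loss = loss_fn(pred, lbl.unsqueeze(0))
            loss.backward()
            opt.step()
    model.eval()
    return model

# Usage for fraction k of one patient (PSF_cumulative); for PSF_1 the list
# would contain only the data of fraction 1:
# psf_model = finetune_for_fraction(mri_only_model, images[:k-1], labels[:k-1])
# with torch.no_grad():
#     seg_k = torch.sigmoid(psf_model(images[k-1].unsqueeze(0))) > 0.5
```

The 95th percentile Hausdorff distance reported in the results can be computed from the segmentation surfaces as sketched below; this follows the common definition over the pooled surface-to-surface distances (as implemented, e.g., in MedPy's hd95), under the assumption of binary masks on a regular voxel grid.

```python
# 95HD sketch: boundary voxels via binary erosion, nearest-surface distances
# via the Euclidean distance transform, 95th percentile over both directions.
import numpy as np
from scipy import ndimage

def hd95(pred: np.ndarray, ref: np.ndarray, spacing=(1.0, 1.0, 1.0)) -> float:
    """95th percentile of the symmetric surface distances (in mm) between
    two binary masks with the given voxel spacing."""
    def surface(mask):
        mask = np.asarray(mask, dtype=bool)
        return mask & ~ndimage.binary_erosion(mask)    # boundary voxels
    def directed(a, b):
        # distance from every surface voxel of a to the nearest surface voxel of b
        dt = ndimage.distance_transform_edt(~surface(b), sampling=spacing)
        return dt[surface(a)]
    d = np.concatenate([directed(pred, ref), directed(ref, pred)])
    return float(np.percentile(d, 95))
```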
