Abstract
Image-guided radiotherapy (IGRT) allows observation of the location and shape of the tumor and organs-at-risk (OARs) over the course of a radiation cancer treatment. Such information may in turn be used to reduce geometric uncertainties during therapeutic planning, dose delivery and response assessment. However, given the multiple imaging modalities and/or contrasts potentially included within the imaging protocol over the course of the treatment, the current manual approach to determining tissue displacement may become time-consuming and error-prone. In this context, variational multi-modal deformable image registration (DIR) algorithms allow automatic estimation of tumor and OAR deformations across the acquired images. In addition, they require short computational times and a low number of input parameters, which is particularly beneficial for online adaptive applications requiring on-the-fly adaptation with the patient on the treatment table. However, the majority of such DIR algorithms assume that all structures across the entire field-of-view (FOV) undergo a similar deformation pattern. Given that various anatomical structures may behave considerably differently, this may lead to the estimation of anatomically implausible deformations at some locations, thus limiting their validity. Therefore, in this paper we propose an anatomically-adaptive variational multi-modal DIR algorithm, which employs a regionalized registration model in accordance with the local underlying anatomy. The algorithm was compared against two existing methods which employ global assumptions on the estimated deformation patterns. Compared to the existing approaches, the proposed method demonstrated improved anatomical plausibility of the estimated deformations over the entire FOV, as well as overall higher accuracy. Moreover, despite the more complex registration model, the proposed approach is very fast and thus suitable for online scenarios.
Therefore, future adaptive IGRT workflows may benefit from an anatomically-adaptive registration model for precise contour propagation and dose accumulation in areas showcasing considerable variations in anatomical properties.
Highlights
One of the major challenges during external-beam radiotherapy (EBRT) is addressing the geometrical uncertainties introduced by the changes in shape and location of the tumor and the organs-at-risk (OARs) over the course of the treatment (Roach et al 2011).
CT/CBCT-based tracking in lung cancer patients: Figure 1 illustrates the statistical distribution of the Dice Similarity Coefficient (DSC) following the registration of the CT-CT and CT-CBCT datasets.
While all three algorithms improve the post-registration DSC, there is a noticeable tendency of the EVI algorithm to under-perform for the lungs compared to both EVO and AEVO.
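For reference, the DSC used to compare the algorithms above measures the voxel-wise overlap between two segmentation masks as 2|A ∩ B| / (|A| + |B|). A minimal sketch of its computation (the function name is ours, not from the paper):

```python
import numpy as np

def dice_similarity(mask_a, mask_b):
    """Dice Similarity Coefficient: 2|A ∩ B| / (|A| + |B|).

    Both inputs are binary masks of identical shape; returns a value
    in [0, 1], where 1 indicates perfect overlap.
    """
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both masks empty: conventionally perfect overlap
    return 2.0 * np.logical_and(a, b).sum() / denom
```

For example, two 8-voxel masks sharing 4 voxels yield a DSC of 2·4 / (8 + 8) = 0.5.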
Summary
One of the major challenges during external-beam radiotherapy (EBRT) is addressing the geometrical uncertainties introduced by the changes in shape and location of the tumor and the organs-at-risk (OARs) over the course of the treatment (Roach et al 2011). A feasible solution for automatic tracking of organ and pathological tissue boundaries over the course of the treatment is multi-modal deformable image registration (DIR) (Hill et al 2001, Mani & Arivazhagan 2013, Sotiras et al 2013). Such methods have the capability to estimate voxel-wise deformations across images acquired either with the same or a different modality and/or contrast. Despite the more complex nature of an anatomically adaptive approach, the numerical implementation has been optimized such that the performance remains entirely compatible with the requirements of online IGRT.
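To illustrate the variational DIR principle described above (iteratively minimizing an image-dissimilarity term under a smoothness regularizer to obtain a voxel-wise deformation), the following is a deliberately simplified mono-modal, demons-style sketch. It is not the paper's multi-modal anatomically-adaptive algorithm; the function name, parameters, and the Gaussian-smoothing regularizer are our illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates

def demons_register(fixed, moving, iters=60, step=1.0, sigma=1.0):
    """Minimal 2-D demons-style variational registration sketch.

    Iteratively updates a dense displacement field warping `moving`
    toward `fixed`; Gaussian smoothing of the field after each update
    plays the role of the variational regularization term.
    """
    ny, nx = fixed.shape
    gy, gx = np.mgrid[0:ny, 0:nx].astype(float)
    dy = np.zeros((ny, nx))  # displacement along axis 0
    dx = np.zeros((ny, nx))  # displacement along axis 1
    for _ in range(iters):
        warped = map_coordinates(moving, [gy + dy, gx + dx],
                                 order=1, mode='nearest')
        diff = warped - fixed
        fy, fx = np.gradient(warped)
        norm = fy**2 + fx**2 + diff**2 + 1e-9
        # demons force: push each voxel down the intensity-difference gradient
        dy -= step * diff * fy / norm
        dx -= step * diff * fx / norm
        # regularization: keep the deformation field smooth
        dy = gaussian_filter(dy, sigma)
        dx = gaussian_filter(dx, sigma)
    warped = map_coordinates(moving, [gy + dy, gx + dx],
                             order=1, mode='nearest')
    return warped, (dy, dx)
```

A spatially uniform `sigma`, as here, is exactly the kind of global regularization assumption the paper argues against; an anatomically-adaptive scheme would instead vary the regularization per region.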