Abstract
Contrast-enhanced computed tomography (CECT) is commonly used in the clinical practice of radiotherapy for improved delineation of tumors and organs at risk (OARs), since it provides additional visualization of soft-tissue and vessel anatomy. However, the additional CECT scan leads to increased radiation dose, prolonged scan time, risk of contrast-induced nephropathy (CIN), a potential need for image registration to the non-contrast simulation CT, and elevated cost. Hypothesizing that the non-contrast simulation CT contains sufficient features to differentiate blood from other soft tissues, in this study we propose a novel deep learning-based method for generating CECT images from non-contrast CT. The method exploits a cycle-consistent generative adversarial network (CycleGAN) framework to learn a mapping from non-contrast CT to CECT. A residual U-Net was employed as the generator of the CycleGAN to force the model to learn the specific difference between non-contrast CT and CECT. The proposed algorithm was evaluated on 20 sets of abdominal patient data using five-fold cross-validation. Each patient was scanned at the same position with both non-contrast simulation CT and CECT. The CECT images served as the training targets during training and as the ground truth during testing, while the non-contrast simulation CT served as the input. Preliminary visual and quantitative evaluations suggest that the proposed method can effectively generate CECT images from non-contrast CT. This method could improve anatomy definition and contouring in radiotherapy without the additional clinical effort of CECT scanning.
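As a rough illustration of the architecture described above, the sketch below shows a residual U-Net generator (the network predicts only the difference between non-contrast CT and CECT, which is added back to the input) together with the cycle-consistency term that constrains the two generators of a CycleGAN. All layer sizes, variable names, and the loss weight are illustrative assumptions, not the authors' implementation.

```python
# Minimal PyTorch sketch: residual U-Net generator + CycleGAN cycle-consistency loss.
# Hypothetical layer widths and names; the paper's actual network is deeper.
import torch
import torch.nn as nn

class ResidualUNet(nn.Module):
    """U-Net whose output is added to its input, so the network only has to
    learn the non-contrast-to-CECT difference (the contrast enhancement)
    rather than reproduce the full image."""
    def __init__(self, ch=32):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(1, ch, 3, padding=1), nn.ReLU())
        self.down = nn.Sequential(nn.Conv2d(ch, 2 * ch, 3, stride=2, padding=1), nn.ReLU())
        self.up = nn.Sequential(nn.ConvTranspose2d(2 * ch, ch, 2, stride=2), nn.ReLU())
        self.out = nn.Conv2d(2 * ch, 1, 3, padding=1)

    def forward(self, x):
        e = self.enc(x)                        # encoder features
        d = self.up(self.down(e))              # bottleneck + decoder
        skip = torch.cat([d, e], dim=1)        # U-Net skip connection
        return x + self.out(skip)              # residual: input + learned difference

# Two generators map between the image domains; cycle consistency ties them together.
G_ncct2cect = ResidualUNet()   # non-contrast CT -> synthetic CECT
G_cect2ncct = ResidualUNet()   # CECT -> synthetic non-contrast CT

ncct = torch.randn(4, 1, 128, 128)  # dummy batch of non-contrast CT slices
cect = torch.randn(4, 1, 128, 128)  # dummy batch of CECT slices

l1 = nn.L1Loss()
cycle_loss = (l1(G_cect2ncct(G_ncct2cect(ncct)), ncct)
              + l1(G_ncct2cect(G_cect2ncct(cect)), cect))
# In full CycleGAN training this cycle term is combined with adversarial
# losses from two discriminators, typically weighted by a factor around 10.
```

In this sketch the residual connection (`x + self.out(skip)`) is what distinguishes the residual U-Net from a plain U-Net: it biases the generator toward an identity mapping, so training concentrates on the enhancement signal itself.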