Abstract

Due to inter-fraction anatomical variation, fast, low-dose volumetric imaging during prostate radiation therapy is highly desirable for patient setup and daily treatment dose estimation. In this study, we propose a novel generative adversarial network integrated with perceptual supervision to derive 3D volumetric images from two orthogonal 2D projections. Our proposed network, named TransNet, consists of three modules: encoding, transformation and decoding. Rather than supervising the network with only an image distance loss between the generated 3D images and the ground-truth 3D CT images, an adversarial loss is also used to improve the realism of the generated 3D images. We conducted a study on 20 patients who had received prostate radiotherapy at our institution, and evaluated the efficacy and consistency of our method for two orthogonal projection angles, 0° and 90°. For each 3D CT image, we simulated its 2D projections at these two angles. TransNet takes the two projections as input and outputs the 3D CT image. The mean absolute error (MAE), peak signal-to-noise ratio (PSNR) and structural similarity index metric (SSIM) achieved by our method are 117.5±15.3 HU, 22.7±3.8 dB and 0.904±0.27, respectively. These results demonstrate the feasibility and efficacy of our 2D-to-3D method for prostate cancer patients, offering a potential solution for fast on-board volumetric imaging for patient setup and adaptive radiation therapy.
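The reported MAE and PSNR follow their standard definitions; a minimal NumPy sketch of these two metrics is shown below on toy volumes (the `data_range` value and the toy arrays are illustrative assumptions, not values from the study):

```python
import numpy as np

def mae(pred, gt):
    # Mean absolute error in the images' native units (HU for CT)
    return np.mean(np.abs(pred - gt))

def psnr(pred, gt, data_range):
    # Peak signal-to-noise ratio in dB; data_range is the assumed
    # maximum intensity span of the ground-truth volume
    mse = np.mean((pred - gt) ** 2)
    return 10.0 * np.log10(data_range ** 2 / mse)

# Toy 3D volumes for illustration: a uniform 10 HU offset
gt = np.zeros((4, 4, 4))
pred = gt + 10.0

print(mae(pred, gt))                      # 10.0
print(round(psnr(pred, gt, 1000.0), 1))   # 40.0
```

SSIM additionally compares local luminance, contrast and structure statistics between the two volumes, so in practice it is usually computed with a library implementation rather than by hand.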
