Abstract

Objective. In helical tomotherapy, image-guided radiotherapy employs megavoltage computed tomography (MVCT) for precise targeting. However, imaging with a high-energy megavoltage beam introduces substantial noise, significantly compromising MVCT image clarity. This study aims to enhance MVCT image quality with a deep learning-based denoising method. Approach. We propose an unpaired MVCT denoising network built on a coupled generative adversarial network framework (DeCoGAN). Our approach assumes that a universal latent code within a shared latent space can reconstruct any given pair of images. By employing an encoder, we enforce this shared-latent-space constraint, enabling the conversion of low-quality (noisy) MVCT images into high-quality (denoised) counterparts. The network learns the joint distribution of images from the two domains using samples drawn from their respective marginal distributions, combined with adversarial training for effective denoising. Main Results. Compared with an analytical algorithm (BM3D) and three deep learning-based methods (RED-CNN, WGAN-VGG, and CycleGAN), the proposed method best preserves image details and improves perceived image quality, removing most of the noise while retaining structural features. Quantitative analysis shows that our method achieves the highest peak signal-to-noise ratio (PSNR) and structural similarity index measure (SSIM) values, indicating superior denoising performance. Significance. The proposed DeCoGAN method shows remarkable MVCT denoising performance, making it a promising tool in the field of radiation therapy.
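The quantitative comparison in the abstract rests on PSNR and SSIM. For reference, a minimal NumPy sketch of these metrics is shown below; the function names are illustrative, and `ssim_global` is a simplified single-window form of SSIM rather than the usual locally windowed, averaged version used in published evaluations:

```python
import numpy as np

def psnr(ref, test, data_range=1.0):
    """Peak signal-to-noise ratio in dB (higher is better)."""
    mse = np.mean((ref.astype(np.float64) - test.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(data_range ** 2 / mse)

def ssim_global(ref, test, data_range=1.0):
    """Simplified global SSIM: one window covering the whole image."""
    x = ref.astype(np.float64)
    y = test.astype(np.float64)
    # Standard stabilizing constants from the SSIM definition.
    c1 = (0.01 * data_range) ** 2
    c2 = (0.03 * data_range) ** 2
    mu_x, mu_y = x.mean(), y.mean()
    var_x, var_y = x.var(), y.var()
    cov_xy = ((x - mu_x) * (y - mu_y)).mean()
    return ((2 * mu_x * mu_y + c1) * (2 * cov_xy + c2)) / (
        (mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2)
    )
```

With intensities scaled to [0, 1] (`data_range=1.0`), a denoised image that matches the reference more closely yields a higher PSNR and an SSIM closer to 1.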
