Abstract

Objective. Computed tomography (CT) is a widely used imaging modality for disease detection. However, CT images often suffer from ring artifacts, which may arise from hardware defects and other factors. These artifacts degrade image quality and impede diagnosis. To address this challenge, we propose a novel method based on a dual contrastive learning image style-transfer network (DCLGAN) that effectively removes ring artifacts from CT images while preserving texture details.

Approach. Our method simulates ring artifacts on real CT data to generate uncorrected CT (uCT) data and transforms the images into the polar coordinate system, where the ring artifacts appear as stripe artifacts. The DCLGAN synthesis network is then applied in the polar domain to remove the stripe artifacts and generate a synthetic CT (sCT). We subtract the sCT from the uCT to obtain a residual image, which is filtered to extract the stripe artifacts. An inverse polar transformation converts these stripes back into ring artifacts, which are subtracted from the original CT image to produce the corrected image.

Main results. To validate the effectiveness of our approach, we tested it on real CT data, simulated data, and cone-beam computed tomography images of patients' brains. The corrected CT images showed a reduction in mean absolute error of 12.36 Hounsfield units (HU), a decrease in root mean square error of 18.94 HU, an increase in peak signal-to-noise ratio of 3.53 decibels (dB), and an improvement in the structural similarity index of 9.24%.

Significance. These results demonstrate the efficacy of our method in eliminating ring artifacts while preserving image details, making it a valuable tool for CT imaging.
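
To make the correction pipeline in the Approach concrete, the sketch below outlines the post-processing steps in Python. It assumes a trained DCLGAN generator is available as the hypothetical callable `dclgan_generator`; the polar warps use OpenCV, and a simple angular smoothing filter stands in for the paper's exact stripe-extraction filter, so the window size and polar resolution are illustrative assumptions rather than the authors' settings.

```python
# Minimal sketch of the ring-artifact correction pipeline described above.
# Assumptions (not from the paper): `dclgan_generator` is a trained DCLGAN
# generator acting on a 2-D polar image; filter size and polar resolution
# are placeholders.
import cv2
import numpy as np
from scipy.ndimage import uniform_filter1d


def correct_ring_artifacts(uct, dclgan_generator, polar_size=(512, 720)):
    """Remove ring artifacts from an uncorrected CT slice (2-D array in HU)."""
    h, w = uct.shape
    center = (w / 2.0, h / 2.0)
    max_radius = np.hypot(w, h) / 2.0

    # 1. Polar transform: rings (constant radius) become stripes.
    polar_uct = cv2.warpPolar(uct.astype(np.float32), polar_size, center,
                              max_radius, cv2.WARP_POLAR_LINEAR)

    # 2. DCLGAN synthesis: generate a stripe-free synthetic CT (sCT) in polar space.
    polar_sct = dclgan_generator(polar_uct)

    # 3. Residual image: contains the stripe artifacts plus anatomical mismatch.
    residual = polar_uct - polar_sct

    # 4. Filter along the angular axis (rows) so that only the slowly varying
    #    stripe component, nearly constant over angle, is retained.
    stripes = uniform_filter1d(residual, size=31, axis=0)

    # 5. Inverse polar transform: stripes map back to ring artifacts.
    rings = cv2.warpPolar(stripes, (w, h), center, max_radius,
                          cv2.WARP_POLAR_LINEAR | cv2.WARP_INVERSE_MAP)

    # 6. Subtract the estimated ring artifacts from the original image.
    return uct - rings
```

The angular averaging in step 4 exploits the fact that a ring artifact has nearly constant intensity over angle in polar coordinates, while anatomical residuals vary with angle and are suppressed.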
