Abstract

Computed Tomography (CT) imaging is one of the most widely used and cost-effective technologies for organ screening and disease diagnosis. However, metallic implants in some patients corrupt the acquired CT images with undesirable metal artifacts, which severely degrade image quality and hinder diagnosis. Although many metal artifact reduction methods have been proposed, the task remains challenging: existing results suffer from symptom variance, secondary artifacts, and poor subjective evaluation. To address these problems, we propose a novel metal artifact reduction method based on generative adversarial networks that simultaneously reduces metal artifacts and enhances the texture structure of corrected CT images. Specifically, we first fuse interactive information (text) and CT imaging (image) into a comprehensive multi-modal feature representation, which overcomes the limited representative ability of single-modal data. Incorporating the interactive information constrains feature generation to ensure symptom consistency between the corrected and target CT. We then design an edge-enhancement sub-network to avoid secondary artifacts and suppress noise. In addition, we invited three professional physicians to subjectively evaluate the corrected CT images. On the DeepLesion dataset, our method achieves average improvements of 11.3% in PSNR and 12.1% in SSIM. The subjective evaluations by physicians show that our method outperforms competing approaches by 6.3%, 7.1%, 5.5%, and 6.9% in terms of sharpness, resolution, invariance, and acceptability, respectively. The proposed method thus achieves high-quality metal artifact reduction.
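The abstract describes a generator that conditions CT image features on a text embedding and refines structure with an edge-enhancement branch. The following is a minimal PyTorch sketch of that idea under stated assumptions; the class name `MultiModalFusionGenerator`, the channel-gating fusion, and all layer sizes are illustrative assumptions, not the paper's actual architecture, which the abstract does not specify.

```python
# Minimal sketch (PyTorch) of a text-conditioned generator with an
# edge-enhancement branch. All names and shapes below are hypothetical.
import torch
import torch.nn as nn

class MultiModalFusionGenerator(nn.Module):
    """Fuses a text (interaction/report) embedding with CT image features,
    then adds an edge-enhancement branch before reconstruction."""
    def __init__(self, text_dim=128, base_ch=32):
        super().__init__()
        # Image encoder: artifact-corrupted CT slice -> feature map
        self.img_enc = nn.Sequential(
            nn.Conv2d(1, base_ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(base_ch, base_ch, 3, padding=1), nn.ReLU(inplace=True),
        )
        # Text encoder: interaction embedding -> channel-wise modulation
        self.txt_enc = nn.Sequential(
            nn.Linear(text_dim, base_ch), nn.ReLU(inplace=True),
            nn.Linear(base_ch, base_ch),
        )
        # Edge-enhancement branch operating on the fused features
        self.edge_branch = nn.Sequential(
            nn.Conv2d(base_ch, base_ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(base_ch, 1, 3, padding=1),
        )
        # Decoder producing the artifact-reduced CT slice
        self.decoder = nn.Sequential(
            nn.Conv2d(base_ch, base_ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(base_ch, 1, 3, padding=1),
        )

    def forward(self, ct, text_emb):
        f_img = self.img_enc(ct)                      # (B, C, H, W) image features
        gate = torch.sigmoid(self.txt_enc(text_emb))  # (B, C) text-derived gates
        fused = f_img * gate[:, :, None, None]        # text-conditioned fusion
        edge_map = self.edge_branch(fused)            # predicted edge/structure map
        corrected = self.decoder(fused) + edge_map    # edge-enhanced reconstruction
        return corrected, edge_map


if __name__ == "__main__":
    g = MultiModalFusionGenerator()
    ct = torch.randn(2, 1, 64, 64)   # toy artifact-corrupted slices
    txt = torch.randn(2, 128)        # toy text embeddings
    out, edges = g(ct, txt)
    print(out.shape, edges.shape)    # both torch.Size([2, 1, 64, 64])
```

In a full adversarial setup, this generator would be trained against a discriminator on artifact-free CT images; that training loop is omitted here since the abstract gives no details of the loss design.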
