Abstract
As two different modalities of medical images, Magnetic Resonance (MR) and Computed Tomography (CT) provide mutually complementary information to doctors in clinical applications. However, obtaining both images is sometimes costly or infeasible, particularly for special populations. For example, patients with metal implants are unsuitable for MR scanning. Likewise, it may be infeasible to acquire multi-contrast MR images in a single clinical scan. In this context, synthesizing the needed MR images for patients whose CT images are available becomes valuable. To this end, we present a novel generative network, called CAE-ACGAN, which combines the advantages of the Variational Auto-Encoder (VAE) and the Generative Adversarial Network (GAN) with an auxiliary discriminative classifier network. We apply this network to synthesize multi-contrast MR images from a single CT image and conduct experiments on brain datasets. Our main contributions can be summarized as follows: 1) We alleviate the problems of image blurriness and mode collapse by integrating the advantages of VAE and GAN; 2) We solve the complicated cross-domain, multi-contrast MR synthesis task using the proposed network; 3) The technique of random-extraction-patches is used to mitigate the limitation of insufficient training data, yielding promising results even with limited available data; 4) Compared with other typical networks, ours yields more realistic, higher-quality synthetic MR images, demonstrating the effectiveness and stability of our proposed network.
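The abstract does not give implementation details of the random-extraction-patches technique; the following is a minimal sketch of the general idea, assuming registered (spatially aligned) 2-D CT/MR slice pairs. The function name `extract_random_patches` and all parameters are hypothetical, not taken from the paper.

```python
import numpy as np

def extract_random_patches(ct, mr, patch_size=64, n_patches=100, seed=0):
    """Sample spatially aligned random patches from a registered CT/MR slice pair.

    ct, mr: 2-D arrays of identical shape.
    Returns two arrays of shape (n_patches, patch_size, patch_size), where
    patch i in both outputs covers the same spatial location.
    """
    assert ct.shape == mr.shape, "CT and MR slices must be registered (same shape)"
    rng = np.random.default_rng(seed)
    h, w = ct.shape
    # Random top-left corners such that each patch lies fully inside the slice.
    ys = rng.integers(0, h - patch_size + 1, size=n_patches)
    xs = rng.integers(0, w - patch_size + 1, size=n_patches)
    ct_patches = np.stack([ct[y:y + patch_size, x:x + patch_size] for y, x in zip(ys, xs)])
    mr_patches = np.stack([mr[y:y + patch_size, x:x + patch_size] for y, x in zip(ys, xs)])
    return ct_patches, mr_patches

# Synthetic data standing in for one registered CT/MR slice pair.
ct_slice = np.random.rand(256, 256)
mr_slice = np.random.rand(256, 256)
ct_p, mr_p = extract_random_patches(ct_slice, mr_slice, patch_size=64, n_patches=100)
print(ct_p.shape)  # (100, 64, 64)
```

Because each slice pair yields many overlapping patches, a small dataset can still provide a large and varied pool of training samples, which is the motivation stated in contribution 3).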