Abstract

Magnetic resonance imaging (MRI) is inherently limited by low imaging speed. Acceleration methods based on under-sampled k-space data have been widely exploited to speed up data acquisition without degrading image quality. Sensitivity encoding (SENSE) is the most commonly used method for multi-channel imaging. However, SENSE suffers from severe g-factor artifacts at high under-sampling factors. This paper applies generative adversarial networks (GAN) to remove g-factor artifacts from SENSE reconstructions. Our method was evaluated on a public knee database containing 20 healthy participants and compared with a conventional GAN that takes zero-filled (ZF) images as input. Structural similarity (SSIM), peak signal-to-noise ratio (PSNR), and normalized mean square error (NMSE) were calculated to assess image quality, and a paired Student's t-test was conducted to compare the metrics between methods, with statistical significance considered at P<0.01. The proposed method outperformed SENSE, variational network (VN), and ZF + GAN in terms of SSIM (SENSE + GAN: 0.81±0.06, SENSE: 0.40±0.07, VN: 0.79±0.06, ZF + GAN: 0.77±0.06), PSNR (SENSE + GAN: 31.90±1.66, SENSE: 22.70±1.99, VN: 31.35±2.01, ZF + GAN: 29.95±1.59), and NMSE (×10⁻⁷) (SENSE + GAN: 0.95±0.34, SENSE: 4.81±1.33, VN: 0.97±0.30, ZF + GAN: 1.60±0.84) at under-sampling factors of up to 6. This study demonstrates the feasibility of using GAN to improve the performance of SENSE reconstruction. The improvement is more pronounced at higher under-sampling rates, which shows great potential for many clinical applications.
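
As a concrete illustration of the under-sampling that the abstract refers to (not taken from the paper), the sketch below retrospectively under-samples a 2D k-space along the phase-encode direction and reconstructs a zero-filled (ZF) image. It assumes a DC-centered k-space and a 1D random mask with a fully sampled low-frequency band; the paper's actual sampling pattern is not specified in the abstract.

```python
import numpy as np

def zero_filled_recon(kspace, accel=6, center_frac=0.08, seed=0):
    """Retrospective under-sampling plus zero-filled reconstruction.

    `kspace` is a DC-centered 2D complex array (phase-encode axis first).
    Unacquired phase-encode lines are left as zeros before the inverse
    FFT, producing the aliased ZF image used as a baseline input.
    Illustrative assumption: 1D random mask with a fully sampled center.
    """
    ny, _ = kspace.shape
    rng = np.random.default_rng(seed)

    # Fully sample a central band of low-frequency lines.
    mask = np.zeros(ny, dtype=bool)
    n_center = int(ny * center_frac)
    mask[(ny - n_center) // 2 : (ny + n_center) // 2] = True

    # Randomly add lines so roughly 1/accel of all lines are acquired.
    n_random = max(ny // accel - n_center, 0)
    mask[rng.choice(np.flatnonzero(~mask), size=n_random, replace=False)] = True

    under = np.where(mask[:, None], kspace, 0)    # zero-fill missing lines
    img = np.fft.ifft2(np.fft.ifftshift(under))   # back to image space
    return np.abs(img)
```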
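The quantitative evaluation can be sketched in the same spirit. The `skimage` and `scipy` calls below are standard implementations of SSIM, PSNR, and the paired Student's t-test; the paper's exact parameter choices (e.g., SSIM window settings) are not given in the abstract, so defaults are assumed.

```python
import numpy as np
from scipy.stats import ttest_rel
from skimage.metrics import structural_similarity, peak_signal_noise_ratio

def nmse(ref, rec):
    """Normalized mean square error: ||ref - rec||^2 / ||ref||^2."""
    return float(np.sum((ref - rec) ** 2) / np.sum(ref ** 2))

def slice_metrics(ref, rec):
    """SSIM, PSNR, and NMSE for one magnitude slice of the same shape."""
    data_range = float(ref.max() - ref.min())
    return (structural_similarity(ref, rec, data_range=data_range),
            peak_signal_noise_ratio(ref, rec, data_range=data_range),
            nmse(ref, rec))

def compare_methods(scores_a, scores_b, alpha=0.01):
    """Paired Student's t-test over per-slice scores of two methods,
    with significance at P < alpha (the abstract uses P < 0.01)."""
    t, p = ttest_rel(scores_a, scores_b)
    return p < alpha, t, p
```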
