Abstract

Objective
The study of deep learning-based fast magnetic resonance imaging (MRI) reconstruction methods has become popular in recent years. However, it is still challenging to obtain high-fidelity MR images from undersampled data at large acceleration factors. The objective of this study was to improve the reconstruction quality of undersampled MR images by exploiting the correlation between pixels in the image and the perceptual similarity between the initial reconstructed images and the fully sampled ones.

Methods
Existing reconstruction methods often suffer from aliasing artefacts and blurred texture. We propose a novel perceptual residual self-attention generative adversarial model, SRSAGAN, to tackle this issue for fast and accurate MRI reconstruction. The proposed method involves two steps. In step 1, leveraging the average information and similarity weights of high-level features, we designed a novel feature enhancement module based on residual self-attention to restore fine texture details in the reconstructed image. In step 2, to improve the accuracy of network inference, we compared the perceptual similarity between the initial reconstructed images and the fully sampled ones. Illustrative sketches of both steps are given after the abstract.

Results
The proposed algorithm was evaluated on the fastMRI dataset. The results demonstrate that SRSAGAN outperforms state-of-the-art methods and reconstructs complex image texture with only 10% of the k-space raw data retained (10-fold acceleration). The reconstruction time is only about 7.78 ms.
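To make step 1 concrete, the following is a minimal sketch of a residual self-attention block in the style the abstract describes: similarity weights are computed between all spatial positions of a high-level feature map, and the attended features are added back to the input through a residual connection. This is a generic SAGAN-style layer written for illustration; the class name, projection sizes, and the learnable scale `gamma` are assumptions, not the authors' actual SRSAGAN code.

```python
# Hypothetical sketch of a residual self-attention block (not the authors' code).
import torch
import torch.nn as nn

class ResidualSelfAttention(nn.Module):
    """Self-attention over spatial positions with a residual connection."""

    def __init__(self, channels: int):
        super().__init__()
        # 1x1 convolutions project features into query/key/value spaces.
        self.query = nn.Conv2d(channels, channels // 8, kernel_size=1)
        self.key = nn.Conv2d(channels, channels // 8, kernel_size=1)
        self.value = nn.Conv2d(channels, channels, kernel_size=1)
        # Learnable scale initialized to zero, so the block starts as identity.
        self.gamma = nn.Parameter(torch.zeros(1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)  # (b, h*w, c//8)
        k = self.key(x).flatten(2)                    # (b, c//8, h*w)
        v = self.value(x).flatten(2)                  # (b, c, h*w)
        # Similarity weights between every pair of spatial positions.
        attn = torch.softmax(q @ k, dim=-1)           # (b, h*w, h*w)
        out = (v @ attn.transpose(1, 2)).view(b, c, h, w)
        # Residual connection: attended features are added back to the input.
        return self.gamma * out + x
```

Initializing `gamma` to zero is a common design choice for this kind of block: training begins with the plain convolutional features and gradually learns how much long-range attention to mix in.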
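Step 2 compares the initial reconstruction and the fully sampled image in a high-level feature space rather than pixel space. A common way to realize such a perceptual similarity term is an L2 distance between features of a fixed pretrained network; the sketch below uses torchvision's VGG16 truncated at an intermediate layer. The specific backbone, layer cutoff, and channel handling are assumptions for illustration only.

```python
# Minimal sketch of a perceptual similarity loss, assuming a pretrained
# VGG16 from torchvision; the truncation point (first 16 layers) is illustrative.
import torch
import torch.nn as nn
from torchvision import models

class PerceptualLoss(nn.Module):
    """L2 distance between high-level VGG features of two images."""

    def __init__(self):
        super().__init__()
        vgg = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features[:16]
        for p in vgg.parameters():
            p.requires_grad = False  # fixed feature extractor, never trained
        self.features = vgg.eval()

    def forward(self, recon: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        # Replicate single-channel MR magnitude images to 3 channels for VGG.
        if recon.shape[1] == 1:
            recon = recon.repeat(1, 3, 1, 1)
            target = target.repeat(1, 3, 1, 1)
        return nn.functional.mse_loss(self.features(recon), self.features(target))
```

In training, a term like `PerceptualLoss()(initial_recon, fully_sampled)` would typically be weighted and added to the adversarial and pixel-wise losses; the exact weighting used by SRSAGAN is not given in the abstract.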
