Abstract

Magnetic Resonance Imaging (MRI) is widely used in clinical diagnosis. However, acquiring a high-spatial-resolution MR image can take tens of minutes, so reconstructing MR images from undersampled k-space data plays a crucial role in accelerating MRI. In particular, deep Convolutional Neural Networks (CNNs) have shown great potential for accelerated MRI. However, the receptive field of a CNN is relatively small, so it fails to capture long-range dependencies. Non-local attention has recently been applied with success to vision tasks owing to its ability to capture such dependencies, but existing non-local attention generally learns long-range interactions among spatial locations in the spatial domain; it is rarely applied in the frequency domain, where it is likely to be beneficial. Recent investigations have begun to combine the Fourier Transform with deep neural networks. In this work, we learn long-range interactions from the frequency perspective. Specifically, we design a novel Non-Local Fourier Attention (NLFA) that combines the self-attention mechanism with the Fourier Transform to capture long-range spatial dependencies in the frequency domain. Furthermore, we propose a deep Residual Non-Local Fourier Network (RNLFNet), constructed from Non-Local Fourier Attention and Residual Blocks, for accelerated MRI. This framework learns information in both the spatial and frequency domains, and thus benefits from modelling both local details and global context between degraded MR images and their ground-truth counterparts. Evaluated on the MICCAI grand challenge and fastMRI datasets, the proposed model significantly improves MR image reconstruction performance.
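The abstract does not specify the NLFA formulation, but the general idea it names, self-attention applied to the Fourier coefficients of a feature map, can be sketched as follows. This is a minimal NumPy illustration under stated assumptions (single-head scaled dot-product attention over flattened spatial positions, a residual connection back to the spatial domain), not the paper's actual module.

```python
import numpy as np

def non_local_fourier_attention(x):
    """Hypothetical sketch of frequency-domain non-local attention.

    x: real-valued feature map of shape (H, W, C).
    Returns a real-valued map of the same shape.
    """
    # 2-D FFT over the spatial axes moves the features into the
    # frequency domain, where each position now mixes global context.
    X = np.fft.fft2(x, axes=(0, 1))                # (H, W, C), complex
    H, W, C = X.shape
    tokens = X.reshape(H * W, C)                   # one token per position

    # Scaled dot-product self-attention among frequency-domain tokens
    # (queries, keys, and values share the tokens for simplicity).
    scores = (tokens @ tokens.conj().T).real / np.sqrt(C)   # (HW, HW)
    scores -= scores.max(axis=1, keepdims=True)    # numerically stable softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)
    attended = weights @ tokens                    # (HW, C), complex

    # Inverse FFT returns to the spatial domain; add a residual connection.
    y = np.fft.ifft2(attended.reshape(H, W, C), axes=(0, 1)).real
    return x + y
```

In a trainable version, the queries, keys, and values would of course come from learned projections; the point of the sketch is only the FFT-attention-inverse-FFT structure that lets spatial long-range dependencies be modelled in the frequency domain.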
