Abstract
Deep learning has been successfully applied to reconstruct Magnetic Resonance (MR) images from undersampled k-space data, thereby accelerating MRI. However, most existing works focus on 2D MRI scans and perform reconstruction slice by slice, owing to the tremendous computation and memory cost of 3D MRI reconstruction, which leaves a gap in research on 3D MRI acceleration. Although 3D MRI provides more accurate diagnostic information, it requires an excessively long scan time. To accelerate 3D MRI, in this work we design DIR3D, a lightweight yet powerful cascaded 3D network that exploits a previously proposed computation-friendly data processing strategy. Specifically, we propose an efficient block, the Dual-Domain Inter-Scale Mutual Reinforcement Block (DIRB), which fuses multi-scale features both locally and globally at negligible computation and memory cost, enhancing the representational capacity of the network. To allow more flexibility, we further redesign the commonly used Data Consistency (DC) layer by introducing a learnable adaptor that enables the network to perform a point-wise adaptive merge of the reconstructed and sampled k-space data while preserving data consistency. We conduct comprehensive experiments on the Stanford MRIData dataset and evaluate DIR3D from multiple perspectives. At the same acceleration factors, DIR3D consistently outperforms state-of-the-art 2D methods across multiple subsampling masks, especially on highly undersampled data, providing strong evidence of its superiority for 3D MRI acceleration. Moreover, at the inference stage, DIR3D achieves much higher reconstruction efficiency.
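To illustrate the kind of data consistency layer described above, the following is a minimal PyTorch sketch of a DC module with a learnable point-wise adaptor. The class name, the sigmoid parameterization, and the per-location weight tensor are assumptions for illustration; the paper's actual formulation may differ.

```python
import torch
import torch.nn as nn


class AdaptiveDataConsistency(nn.Module):
    """Hypothetical sketch of a DC layer with a learnable point-wise adaptor.

    At sampled k-space locations, the output is a learned convex combination of
    the network-reconstructed k-space and the acquired k-space; at unsampled
    locations, the reconstructed values pass through unchanged.
    """

    def __init__(self, kspace_shape):
        super().__init__()
        # One learnable weight per k-space location (the "adaptor").
        # A sigmoid keeps the merge weight in [0, 1]; this parameterization
        # is an assumption, not the paper's exact design.
        self.adaptor = nn.Parameter(torch.zeros(kspace_shape))

    def forward(self, k_rec, k_sampled, mask):
        # k_rec:     complex-valued k-space predicted by the network
        # k_sampled: acquired (undersampled) k-space
        # mask:      binary sampling mask (1 = sampled, 0 = unsampled)
        lam = torch.sigmoid(self.adaptor)
        merged = lam * k_rec + (1.0 - lam) * k_sampled
        # Unsampled locations keep the reconstructed values.
        return mask * merged + (1.0 - mask) * k_rec
```

Setting the adaptor weights to a large negative constant would recover the classical hard DC layer (sampled locations copied verbatim), so the learnable version strictly generalizes it.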