Abstract
Deep learning is an efficient solution for improving the quality of undersampled magnetic resonance (MR) image reconstruction while reducing lengthy data acquisition. Most deep learning methods neglect the mutual constraints between the real and imaginary components of complex-valued k-space data. In this paper, a new complex-valued convolutional neural network, namely, Dense-U-Dense Net (DUD-Net), is proposed to interpolate the undersampled k-space data and reconstruct MR images. The proposed network comprises dense layers, a U-Net, and further dense layers in sequence. The dense layers are used to model the mutual constraints between the real and imaginary components, and the U-Net performs feature sparsification and interpolation estimation for the k-space data. Two MRI datasets were used to evaluate the proposed method: brain magnitude-only MR images and knee complex-valued k-space data. Several data-preprocessing operations were conducted. First, complex-valued MR images were synthesized by phase modulation of the magnitude-only images. Second, a radial trajectory based on the golden angle was used for k-space undersampling, and a reversible normalization method was proposed to balance the distribution of positive and negative values in the k-space data. The optimal performance of DUD-Net was demonstrated by quantitative inter-method and intra-method comparisons. Compared with other methods, significant improvements were achieved: PSNRs increased by 10.78 dB and 5.74 dB, whereas RMSEs decreased by 71.53% and 30.31% for the magnitude and phase images, respectively. It is concluded that DUD-Net significantly improves the performance of MR image reconstruction.
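To make the undersampling step concrete, the sketch below generates a binary k-space sampling mask from radial spokes whose angles increment by the golden angle (~111.25°). This is a minimal illustration of the general golden-angle radial scheme the abstract mentions, not the paper's exact implementation; the grid size, spoke count, and rasterization by rounding to the nearest grid point are all assumptions made here for demonstration.

```python
import numpy as np

# Golden angle in radians (~111.25 degrees); successive spokes rotated by
# this angle cover k-space nearly uniformly for any number of spokes.
GOLDEN_ANGLE = np.deg2rad(111.25)

def golden_angle_radial_mask(n, n_spokes):
    """Binary n x n k-space mask sampled along golden-angle radial spokes.

    Illustrative sketch: each spoke is a straight line through the center
    of the Cartesian grid, rasterized by rounding sample positions to the
    nearest grid point (an assumption, not the paper's trajectory code).
    """
    mask = np.zeros((n, n), dtype=bool)
    center = (n - 1) / 2.0
    # Oversample along each spoke so rounding leaves no gaps in the line.
    radii = np.linspace(-center, center, 2 * n)
    for k in range(n_spokes):
        theta = k * GOLDEN_ANGLE
        rows = np.clip(np.round(center + radii * np.sin(theta)), 0, n - 1).astype(int)
        cols = np.clip(np.round(center + radii * np.cos(theta)), 0, n - 1).astype(int)
        mask[rows, cols] = True
    return mask

mask = golden_angle_radial_mask(256, 64)
# The fraction of sampled points gives the effective undersampling level.
print(mask.shape, round(float(mask.mean()), 3))
```

Applying such a mask element-wise to fully sampled k-space data simulates the retrospective undersampling used for training; the network then estimates the missing (zeroed) k-space entries.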