Abstract
Purpose: To introduce a novel deep learning method for Robust and Accelerated Reconstruction (RoAR) of quantitative, B0-inhomogeneity-corrected R2* maps from multi-gradient recalled echo (mGRE) MRI data.

Methods: RoAR trains a convolutional neural network (CNN) to generate quantitative R2* maps free from field-inhomogeneity artifacts by adopting a self-supervised learning strategy given (a) mGRE magnitude images, (b) the biophysical model describing mGRE signal decay, and (c) a preliminarily evaluated F-function accounting for the contribution of macroscopic B0 field inhomogeneities. Importantly, no ground-truth images are required, and the F-function is needed only during RoAR training, not during application.

Results: RoAR preserves all features of R2* maps while offering significant improvements over existing methods in computation speed (seconds vs. hours) and reduced sensitivity to noise. Even for data with SNR = 5, RoAR produced R2* maps with an accuracy of 22%, whereas direct voxel-wise analysis achieved 47%. For SNR = 10, the RoAR accuracy improved to 17% vs. 24% for direct voxel-wise analysis.

Conclusions: RoAR is trained to recognize macroscopic magnetic field inhomogeneities directly from the input magnitude-only mGRE data and to eliminate their effect on R2* measurements. RoAR training is based on the biophysical model and does not require ground-truth R2* maps. Because RoAR uses signal information not only from individual voxels but also from the spatial patterns of the signals in the images, it reduces the sensitivity of the R2* maps to noise in the data. These features, together with high computational speed, provide significant benefits for the potential use of RoAR in clinical settings.
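To make the self-supervised strategy described above concrete: instead of comparing CNN outputs to ground-truth maps, the predicted parameter maps are passed through the biophysical forward model and compared to the measured mGRE magnitude data. The sketch below is a minimal, hypothetical PyTorch rendering of such a model-based loss, assuming the common monoexponential form S(t) = S0 · exp(−R2*·t) · F(t); it is not the authors' code, and all function and variable names (model_based_loss, echo_times, f_function, etc.) are illustrative assumptions.

```python
# Minimal sketch (assumption, not the paper's implementation) of a self-supervised,
# model-based loss for R2* mapping from mGRE magnitude data:
#   S(t_e) = S0 * exp(-R2* * t_e) * F(t_e),
# where F accounts for macroscopic B0 inhomogeneities and is used only at training time.

import torch

def model_based_loss(pred_s0, pred_r2star, mgre_magnitude, f_function, echo_times):
    """
    pred_s0, pred_r2star : (B, 1, H, W) CNN-predicted parameter maps
    mgre_magnitude       : (B, E, H, W) measured magnitude images at E echo times
    f_function           : (B, E, H, W) pre-computed F-function values
    echo_times           : (E,) tensor of echo times (seconds)
    """
    te = echo_times.view(1, -1, 1, 1)                 # broadcast echoes over batch/space
    decay = torch.exp(-pred_r2star * te)              # monoexponential R2* decay
    predicted_signal = pred_s0 * decay * f_function   # biophysical forward model
    return torch.mean((predicted_signal - mgre_magnitude) ** 2)
```

Consistent with the abstract, the F-function enters only this training loss; at inference the trained CNN maps magnitude mGRE images directly to R2* maps without it.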