Abstract
Purpose: To develop a deep learning-based method for rapid liver proton-density fat fraction (PDFF) and R2* quantification with built-in uncertainty estimation using self-gated free-breathing stack-of-radial MRI.

Methods: An uncertainty-aware physics-driven deep learning network (UP-Net) was developed to (1) suppress radial streaking artifacts caused by undersampling after self-gating, (2) calculate accurate quantitative maps, and (3) provide pixel-wise uncertainty maps. UP-Net incorporated a phase augmentation strategy, a generative adversarial network architecture, and an MRI physics loss term based on a fat-water and R2* signal model. UP-Net was trained and tested using free-breathing multi-echo stack-of-radial MRI data from 105 subjects. UP-Net uncertainty scores were calibrated in a validation dataset and used to predict quantification errors for liver PDFF and R2* in a testing dataset.

Results: Compared with images reconstructed using compressed sensing (CS), UP-Net achieved a structural similarity index >0.87 and a normalized root mean squared error <0.18. Compared with reference quantitative maps generated using CS and graph-cut (GC) algorithms, UP-Net achieved low mean differences (MD) for liver PDFF (-0.36%) and R2* (-0.37 s⁻¹). Compared with breath-holding Cartesian MRI results, UP-Net achieved low MD for liver PDFF (0.53%) and R2* (6.75 s⁻¹). UP-Net uncertainty scores predicted absolute liver PDFF and R2* quantification errors with low MD of 0.27% and 0.12 s⁻¹ compared with CS + GC results. The computational time for UP-Net was 79 ms/slice, whereas CS + GC required 3.2 min/slice.

Conclusion: UP-Net rapidly calculates accurate liver PDFF and R2* maps from self-gated free-breathing stack-of-radial MRI. The pixel-wise uncertainty maps from UP-Net predict quantification errors in the liver.
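The abstract states that the physics loss term is based on a fat-water and R2* signal model. As a rough illustration only (not the paper's implementation), the sketch below shows a standard single-peak variant of such a multi-echo signal model and a simple signal-consistency loss; the function names, the single-peak fat spectrum, and the way PDFF is converted to water/fat amplitudes are all assumptions for illustration.

    import numpy as np

    def fatwater_r2star_signal(W, F, r2star, tes, fat_freq_hz=-428.0):
        """Hypothetical single-peak fat-water signal model with R2* decay.

        W, F        : water and fat signal amplitudes (pixel-wise)
        r2star      : R2* relaxation rate in s^-1
        tes         : echo times in seconds, shape (n_echoes,)
        fat_freq_hz : fat-water chemical shift (single-peak approximation;
                      the paper's model likely uses a multi-peak fat spectrum)
        """
        tes = np.asarray(tes)
        c = np.exp(2j * np.pi * fat_freq_hz * tes)   # fat phase modulation per echo
        return (W + F * c) * np.exp(-r2star * tes)

    def physics_loss(pred_pdff, pred_r2star, echo_images, tes):
        """Sketch of an MRI physics consistency loss: re-synthesize multi-echo
        signals from predicted PDFF/R2* maps and compare with the input echoes.
        echo_images : complex array of shape (..., n_echoes)
        """
        total = np.abs(echo_images[..., 0])          # proxy for W + F near TE = 0
        F = total * pred_pdff
        W = total * (1.0 - pred_pdff)
        synth = fatwater_r2star_signal(W[..., None], F[..., None],
                                       pred_r2star[..., None], tes)
        return np.mean(np.abs(synth - echo_images) ** 2)

In practice, a loss of this kind can be added to the image-domain and adversarial losses so that the predicted PDFF and R2* maps remain consistent with the acquired multi-echo data; the weighting between loss terms is a design choice not specified here.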