Abstract

To accelerate T2 mapping with highly sparse sampling, the proposed method integrates deep learning image priors with low-rank and sparse modeling. High-speed T2 mapping is achieved by sparsely sampling (k, TE)-space; images are then reconstructed from the undersampled data by exploiting the low-rank structure and sparsity of the T2-weighted image sequence together with image priors learned from training data. The image priors for a single TE were generated from the public Human Connectome Project data using a tissue-based deep learning method and were then transferred to the other TEs using a generalized series-based method. With these image priors, the proposed reconstruction used a low-rank model and a sparse model to capture subject-dependent novel features.

The proposed method was evaluated using experimental data acquired from both healthy subjects and tumor patients with a turbo spin-echo sequence. High-quality T2 maps at a resolution of 0.9 × 0.9 × 3.0 mm³ were successfully obtained from highly undersampled data at an acceleration factor of 8. Compared with existing compressed sensing-based methods, the proposed method produced significantly smaller reconstruction errors; compared with deep learning-based methods, it recovered novel features better.

This paper demonstrates the feasibility of learning T2-weighted image priors for multiple TEs using tissue-based deep learning and generalized series-based learning, and proposes a new method that effectively integrates these priors with low-rank and sparse modeling to reconstruct high-quality images from highly undersampled data. The proposed method will complement acquisition-based methods for achieving high-speed T2 mapping.
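The core idea of combining a low-rank model with a sparse model can be illustrated with a minimal low-rank-plus-sparse decomposition of the Casorati matrix (pixels × TEs) of a T2-weighted image series. The sketch below is not the paper's reconstruction algorithm; it is a simplified robust-PCA-style stand-in using alternating proximal steps (singular value thresholding for the low-rank part, soft thresholding for the sparse part), with all parameter values chosen for illustration only.

```python
import numpy as np

def svt(X, tau):
    # Singular value thresholding: proximal operator of the nuclear norm.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft(X, tau):
    # Soft thresholding: proximal operator of the elementwise l1 norm.
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def lps_decompose(M, tau_L=1.0, tau_S=0.1, n_iter=100):
    """Split a Casorati matrix M (pixels x TEs) into a low-rank part L
    (slowly varying T2 decay shared across the image) plus a sparse
    part S (localized, subject-dependent novel features)."""
    L = np.zeros_like(M)
    S = np.zeros_like(M)
    for _ in range(n_iter):
        L = svt(M - S, tau_L)   # update low-rank component
        S = soft(M - L, tau_S)  # update sparse component
    return L, S

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic rank-1 "decay" component plus a few large sparse outliers.
    L0 = rng.standard_normal((100, 1)) @ rng.standard_normal((1, 8))
    S0 = np.zeros((100, 8))
    S0[rng.integers(0, 100, 10), rng.integers(0, 8, 10)] = 5.0
    M = L0 + S0
    L, S = lps_decompose(M)
    print(np.max(np.abs(M - L - S)))  # residual bounded by tau_S
```

In the actual reconstruction problem, the data-consistency term involves the undersampled (k, TE)-space measurements rather than a fully observed matrix M, and the learned image priors constrain the solution further; this toy example only shows how the low-rank and sparse components divide the signal.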
