Abstract
Purpose
To develop a model-based deep neural network for high-quality image reconstruction of undersampled multi-coil CEST data.

Methods
Inspired by the variational network (VN), the CEST image reconstruction is unrolled into a deep neural network (CEST-VN) with a k-space data-sharing block that takes advantage of the inherent redundancy among adjacent CEST frames and 3D spatial-frequential convolution kernels that exploit correlations in the x-ω domain. Additionally, a new pipeline based on multiple-pool Bloch-McConnell simulations is devised to synthesize multi-coil CEST data from publicly available anatomical MRI data. The proposed network is trained on the simulated data with a CEST-specific loss function that jointly measures structural and CEST-contrast fidelity. The performance of CEST-VN was evaluated on four healthy volunteers and five brain tumor patients using retrospectively or prospectively undersampled data with various acceleration factors, and compared with conventional and state-of-the-art reconstruction methods.

Results
The proposed CEST-VN method generated high-quality CEST source images and amide proton transfer-weighted (APTw) maps in healthy and brain tumor subjects, consistently outperforming GRAPPA, blind compressed sensing, and the original VN. As the acceleration factor increased from 3 to 6, CEST-VN with the same hyperparameters yielded similarly accurate reconstructions without apparent loss of detail or increase in artifacts. Ablation studies confirmed the effectiveness of the CEST-specific loss function and the data-sharing block.

Conclusion
The proposed CEST-VN method can provide high-quality CEST source images and APTw maps from highly undersampled multi-coil data by integrating a deep learning prior with the multi-coil sensitivity-encoding model.
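The unrolled VN architecture described above alternates learned regularization with enforcement of consistency against the acquired multi-coil k-space data. The following is a minimal NumPy sketch of one such multi-coil data-consistency gradient step; all variable names are hypothetical, and the learned filters, activation functions, and the k-space data-sharing block of the actual CEST-VN are not reproduced here.

```python
import numpy as np

def fft2c(x):
    # Centered, orthonormal 2D FFT over the last two axes.
    return np.fft.fftshift(
        np.fft.fft2(np.fft.ifftshift(x, axes=(-2, -1)), norm="ortho"),
        axes=(-2, -1))

def ifft2c(k):
    # Inverse of fft2c.
    return np.fft.fftshift(
        np.fft.ifft2(np.fft.ifftshift(k, axes=(-2, -1)), norm="ortho"),
        axes=(-2, -1))

def vn_iteration(x, y, coils, mask, lam=1.0):
    """One unrolled gradient step enforcing multi-coil data consistency.

    x     : current image estimate, shape (H, W), complex
    y     : measured undersampled k-space, shape (C, H, W), complex
    coils : coil sensitivity maps, shape (C, H, W), complex
    mask  : k-space sampling mask, shape (H, W), values in {0, 1}
    lam   : step size (learned per iteration in the actual network)

    In the full network, a learned regularizer gradient would be added
    here; this sketch keeps only the data-consistency term.
    """
    # Forward model A x = M F S x, evaluated per coil.
    residual = mask * (fft2c(coils * x[None]) - y)           # A x - y
    # Adjoint A^H applied to the residual: coil-combine after inverse FFT.
    grad_dc = np.sum(np.conj(coils) * ifft2c(residual), axis=0)
    return x - lam * grad_dc
```

With coil maps normalized so that the sensitivities sum to unit power at each voxel, the data-consistency operator has spectral norm at most one, so the unit step size above is stable; repeating the step drives the residual on the sampled k-space locations toward zero.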