Abstract

Magnetic Resonance Imaging (MRI) reconstruction has made significant progress with the introduction of Deep Learning (DL) combined with Compressed Sensing (CS). However, most existing methods require large fully sampled training datasets to supervise the training process, which may be unavailable in many applications. Current unsupervised models also show limitations in performance or speed and may face a mismatch between training and testing distributions. This paper proposes an unsupervised method to train competitive reconstruction models that generate high-quality samples in an end-to-end manner. First, teacher models are trained in a self-supervised manner by filling in re-undersampled images and comparing the results with the original undersampled images. The teacher models are then distilled to train a cascade model that can leverage the entire undersampled k-space during both training and testing. Additionally, we propose an adaptive distillation method that re-weights samples according to the variance of the teachers' outputs, which reflects the confidence of the reconstruction results, to improve the quality of distillation. Experimental results on multiple datasets demonstrate that our method significantly accelerates inference while preserving or even improving performance compared to the teacher models. In our tests, the distilled models show 5%-10% improvements in PSNR and SSIM compared with no distillation and are 10 times faster than the teachers.
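
The adaptive re-weighting idea can be illustrated with a minimal sketch (not the authors' code; the function name, the ensemble-mean target, and the 1/(variance + eps) weighting rule are assumptions used for illustration): samples on which the teachers disagree (high variance, hence low confidence) contribute less to the distillation loss.

```python
import torch
import torch.nn.functional as F

def adaptive_distillation_loss(student_out, teacher_outs, eps=1e-8):
    """Variance-weighted distillation loss (illustrative sketch).

    student_out:  (B, ...) student reconstruction
    teacher_outs: list of (B, ...) reconstructions from several teachers
    """
    teachers = torch.stack(teacher_outs, dim=0)        # (T, B, ...)
    target = teachers.mean(dim=0)                      # ensemble target for the student
    # Per-sample variance across teachers, used as an inverse-confidence signal.
    var = teachers.var(dim=0).flatten(1).mean(dim=1)   # (B,)
    weight = 1.0 / (var + eps)
    weight = weight / weight.sum() * weight.numel()    # normalize so weights average to 1
    per_sample = F.mse_loss(student_out, target, reduction='none')
    per_sample = per_sample.flatten(1).mean(dim=1)     # (B,)
    return (weight * per_sample).mean()
```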
