Abstract
Neural networks have received recent interest for the reconstruction of undersampled MR acquisitions. Ideally, network performance should be optimized by drawing the training and testing data from the same domain. In practice, however, large datasets comprising hundreds of subjects scanned under a common protocol are rare. The goal of this study is to introduce a transfer-learning approach to address the problem of data scarcity in training deep networks for accelerated MRI. Neural networks were trained on thousands of samples (up to 4000) from public datasets of either natural images or brain MR images. The networks were then fine-tuned using only tens of brain MR images in a distinct testing domain. Domain-transferred networks were compared to networks trained directly in the testing domain. Network performance was evaluated for varying acceleration factors (4-10), numbers of training samples (500-4000), and numbers of fine-tuning samples (0-100). The proposed approach achieves successful domain transfer between MR images acquired with different contrasts (T1- and T2-weighted images) and between natural and MR images (ImageNet and T1- or T2-weighted images). Networks obtained via transfer learning using only tens of images in the testing domain achieve nearly identical performance to networks trained directly in the testing domain using thousands of images (up to 4000). The proposed approach might facilitate the use of neural networks for MRI reconstruction without the need for collection of extensive imaging datasets.
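The sketch below illustrates the pre-train/fine-tune workflow summarized in the abstract, not the authors' actual implementation: a network is first trained on many samples from a source domain and then briefly fine-tuned on a handful of samples from the testing domain. All names (ReconCNN, make_loader) and the synthetic placeholder data are hypothetical; the real architecture, undersampling scheme, and datasets are described in the paper itself.

```python
# Hypothetical sketch of transfer learning for undersampled MRI reconstruction.
# Random tensors stand in for (undersampled, fully sampled) image pairs.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset


class ReconCNN(nn.Module):
    """Toy image-to-image network standing in for the reconstruction model."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)


def make_loader(n_samples, batch_size):
    # Placeholder data: in practice these would be undersampled inputs and
    # fully sampled reference images from the chosen domain.
    x = torch.randn(n_samples, 1, 64, 64)
    y = torch.randn(n_samples, 1, 64, 64)
    return DataLoader(TensorDataset(x, y), batch_size=batch_size, shuffle=True)


def train(model, loader, epochs, lr):
    # Standard supervised training loop minimizing pixel-wise reconstruction error.
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for undersampled, fully_sampled in loader:
            opt.zero_grad()
            loss = loss_fn(model(undersampled), fully_sampled)
            loss.backward()
            opt.step()


model = ReconCNN()
# 1) Pre-train on thousands of samples from a source domain
#    (e.g. natural images or T1-weighted brain MR images).
train(model, make_loader(4000, batch_size=32), epochs=10, lr=1e-3)
# 2) Fine-tune with only tens of samples from the testing domain,
#    typically with a smaller learning rate and more passes over the small set.
train(model, make_loader(40, batch_size=4), epochs=20, lr=1e-4)
```

The key design choice this illustrates is that the fine-tuning stage reuses all pre-trained weights and only needs a small number of testing-domain samples, which is what makes the approach attractive when large protocol-matched datasets are unavailable.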