Abstract
The suppression of motion artefacts from MR images is a challenging task. The purpose of this work was to develop a novel, standalone technique for suppressing motion artefacts in MR images using a data-driven deep learning approach. A simulation framework was developed to generate motion-corrupted images from motion-free images using randomly generated motion profiles. An Inception-ResNet deep learning architecture was used as the encoder and augmented with a stack of convolution and upsampling layers to form an encoder-decoder network. The network was trained on the simulated motion-corrupted images to identify and suppress artefacts attributable to motion. It was then validated on unseen simulated datasets and on real-world, experimentally motion-corrupted in vivo brain datasets. The trained network suppressed motion artefacts in the reconstructed images, increasing the mean structural similarity index (SSIM) from 0.9058 to 0.9338 on the simulated data. On the real-world experimental dataset, the network likewise suppressed motion artefacts, increasing the mean SSIM from 0.8671 to 0.9145. The successful correction of the experimental datasets demonstrates the effectiveness of the motion simulation process. The proposed method removed motion artefacts and outperformed an iterative entropy minimization method, with SSIM and normalized root mean squared error values 5-10% better than those of the baseline. In conclusion, a novel, data-driven motion correction technique has been developed that can suppress motion artefacts in motion-corrupted MR images. Because the proposed technique is a standalone post-processing method that does not interfere with data acquisition or reconstruction parameters, it is well suited to routine clinical practice.
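As an illustrative companion to the abstract, the sketch below shows one way such a motion simulation framework could be implemented: rigid in-plane translation during acquisition is modelled as a random, line-by-line phase error applied to k-space. The motion model (translation only), the function name, and the parameter values are assumptions for illustration and are not taken from the paper.

```python
import numpy as np

def simulate_motion_corruption(image, max_shift_px=3.0, corrupted_fraction=0.3, seed=None):
    """Corrupt a 2D magnitude image with a random translational motion profile.

    Assumed model: in-plane translation during acquisition is expressed as a
    linear phase ramp applied to a random subset of phase-encode lines in k-space.
    """
    rng = np.random.default_rng(seed)
    ny, nx = image.shape
    kspace = np.fft.fftshift(np.fft.fft2(image))

    # Randomly pick phase-encode lines acquired while the subject had moved.
    moved = rng.random(ny) < corrupted_fraction
    shifts = rng.uniform(-max_shift_px, max_shift_px, size=(ny, 2)) * moved[:, None]

    ky = np.fft.fftshift(np.fft.fftfreq(ny))[:, None]   # phase-encode frequencies
    kx = np.fft.fftshift(np.fft.fftfreq(nx))[None, :]   # frequency-encode frequencies

    # A translation by (dy, dx) multiplies k-space by exp(-2*pi*i*(ky*dy + kx*dx));
    # applying it line by line mimics inter-view motion during the scan.
    phase = np.exp(-2j * np.pi * (ky * shifts[:, 0:1] + kx * shifts[:, 1:2]))
    corrupted_kspace = kspace * phase

    return np.abs(np.fft.ifft2(np.fft.ifftshift(corrupted_kspace)))
```

The encoder-decoder described above could similarly be assembled along the following lines in Keras, with an InceptionResNetV2 encoder and a stack of convolution and upsampling layers as the decoder. The decoder depth, filter counts, input size, and final resizing step are assumptions; the abstract specifies only the overall encoder-decoder structure.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

def build_motion_suppression_net(input_shape=(256, 256, 3)):
    """Schematic encoder-decoder: InceptionResNetV2 encoder + conv/upsampling decoder."""
    # Encoder: Inception-ResNet backbone without its classification head.
    encoder = tf.keras.applications.InceptionResNetV2(
        include_top=False, weights=None, input_shape=input_shape)

    x = encoder.output
    # Decoder: stacked convolution + upsampling blocks (depth and filters assumed).
    for filters in (512, 256, 128, 64, 32):
        x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
        x = layers.UpSampling2D(size=2)(x)

    # Restore the input resolution and map to a single (grayscale MR) output channel.
    x = layers.Resizing(input_shape[0], input_shape[1])(x)
    outputs = layers.Conv2D(1, 3, padding="same", activation="linear")(x)
    return Model(encoder.input, outputs, name="motion_suppression_net")
```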