Abstract

Recently, magnetic resonance (MR) imaging has been widely used in clinical practice to assist doctors in diagnosis because of its high soft-tissue resolution and absence of ionizing radiation. However, limited by imaging equipment and acquisition time, the resolution of MR images is often low. Previous super-resolution algorithms for MR images rely on paired datasets: they generate down-sampled images under an artificially assumed degradation process and use the resulting sample pairs to supervise the training of the super-resolution network. A network trained in this way, however, suffers severe performance degradation in practical applications. In this paper, we propose DMN, a blind super-resolution framework based on domain migration, which decouples the blind super-resolution network using the idea of domain migration and, through independent training, avoids the false-detail generation problem of blind super-resolution networks in the MR image super-resolution task. In addition, we propose a multi-level parallel transformer feature enhancement block (MPTB), which takes noise as the original input and embeds noise features step by step, enhancing the fitting ability of the network while reducing the complexity of the network structure design. Furthermore, we use a k-space loss function as a constraint so that the domain migration network fully considers the k-space characteristics of MR images. Experimental results on knee and brain MR images from the FastMRI dataset show that the super-resolution performance of the proposed DMN network surpasses that of previous super-resolution networks.
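The k-space constraint mentioned above can be illustrated with a minimal sketch. The paper does not give its exact formulation, so the following is only an assumed version: MR images are acquired in k-space (the 2-D Fourier domain), and a simple way to build such a loss is to compare the FFTs of the predicted and reference images with an L1 distance. The function name `kspace_l1_loss` and the choice of L1 are illustrative assumptions, not the paper's definition.

```python
import numpy as np

def kspace_l1_loss(pred: np.ndarray, target: np.ndarray) -> float:
    """Illustrative k-space loss (assumed form, not the paper's exact one).

    Transforms both 2-D real-valued images into k-space via the 2-D FFT
    and returns the mean L1 distance between the complex coefficients,
    penalizing frequency-domain discrepancies that a pixel-space loss
    may under-weight.
    """
    k_pred = np.fft.fft2(pred)
    k_target = np.fft.fft2(target)
    # Mean absolute difference of the complex k-space representations.
    return float(np.mean(np.abs(k_pred - k_target)))

# Usage: identical images give zero loss; any mismatch gives a positive value.
img = np.random.rand(64, 64)
assert kspace_l1_loss(img, img) == 0.0
assert kspace_l1_loss(img, img + 0.1) > 0.0
```

In practice such a term would be added to the pixel-space reconstruction loss with a weighting coefficient, so the domain migration network is encouraged to match both image-domain appearance and k-space statistics.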
