Abstract
High-quality, high-resolution magnetic resonance (MR) images provide more detail for diagnosis and analysis. Recently, MR image-guided neurosurgery has become an emerging technique in clinics. Unlike other medical imaging techniques, MR imaging cannot achieve both real-time acquisition and high image quality: the real-time performance is closely tied to the MR hardware itself as well as the k-space data collection strategy, and reducing acquisition time algorithmically is harder than enhancing image quality. Further, when reconstructing low-resolution, noisy MR images, obtaining high-definition, high-resolution MR images as references is difficult or impossible. In addition, existing methods are restricted to learning controllable functions under the supervision of known degradation types and levels. As a result, severely degraded results are inevitable when the modeling assumptions deviate far from the actual degradation. To address these problems, we propose a novel adaptive adjustment method based on real MR images via opinion-unaware measurements for real super-resolution (A2OURSR). It estimates the degrees of blur and noise from the test image itself as two scores. These scores serve as pseudo-labels to train the adaptive adjustable degradation estimation module, whose outputs are then fed to a conditional network that tweaks the generated results. Thus, the results are adjusted automatically by the whole dynamic model. Extensive experiments show that the proposed A2OURSR outperforms state-of-the-art methods on benchmarks both quantitatively and visually.
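To make the described pipeline concrete, the sketch below shows how opinion-unaware blur/noise scores could act as pseudo-labels for a degradation estimator whose output conditions a super-resolution network. This is a minimal illustration under assumed choices (PyTorch, toy module sizes, Laplacian-variance and high-frequency-residual proxies for the scores); it is not the authors' implementation.

```python
# Minimal sketch of an A2OURSR-style pipeline (assumptions: PyTorch, toy
# module sizes, placeholder no-reference score functions).
import torch
import torch.nn as nn
import torch.nn.functional as F


def opinion_unaware_scores(lr: torch.Tensor) -> torch.Tensor:
    """Placeholder no-reference blur/noise scores computed from the image itself.

    Blur is approximated by the inverse Laplacian variance and noise by the
    variance of a high-frequency residual; the paper's actual opinion-unaware
    measures may differ.
    """
    lap_kernel = torch.tensor([[0., 1., 0.], [1., -4., 1.], [0., 1., 0.]],
                              device=lr.device).view(1, 1, 3, 3)
    gray = lr.mean(dim=1, keepdim=True)
    lap = F.conv2d(gray, lap_kernel, padding=1)
    blur_score = 1.0 / (lap.var(dim=(1, 2, 3)) + 1e-6)      # higher = blurrier
    residual = gray - F.avg_pool2d(gray, 3, stride=1, padding=1)
    noise_score = residual.var(dim=(1, 2, 3))                # higher = noisier
    return torch.stack([blur_score, noise_score], dim=1)     # (B, 2) pseudo-labels


class DegradationEstimator(nn.Module):
    """Predicts the two degradation scores directly from the LR input."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 2))

    def forward(self, lr):
        return self.features(lr.mean(dim=1, keepdim=True))


class ConditionalSR(nn.Module):
    """Toy conditional SR network: the degradation vector modulates the features."""
    def __init__(self, scale=2):
        super().__init__()
        self.body = nn.Conv2d(1, 16, 3, padding=1)
        self.cond = nn.Linear(2, 16)              # degradation -> channel-wise scale
        self.tail = nn.Conv2d(16, scale * scale, 3, padding=1)
        self.up = nn.PixelShuffle(scale)

    def forward(self, lr, degradation):
        feat = F.relu(self.body(lr.mean(dim=1, keepdim=True)))
        feat = feat * self.cond(degradation).unsqueeze(-1).unsqueeze(-1)
        return self.up(self.tail(feat))


# Usage: pseudo-labels supervise the estimator; its output conditions the SR net.
lr = torch.rand(2, 1, 32, 32)
estimator, sr_net = DegradationEstimator(), ConditionalSR()
pseudo = opinion_unaware_scores(lr)
est_loss = F.mse_loss(estimator(lr), pseudo)      # train estimator on pseudo-labels
sr = sr_net(lr, estimator(lr))                    # output adapts to estimated degradation
print(sr.shape)  # torch.Size([2, 1, 64, 64])
```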