Abstract
Magnetic resonance sounding (MRS) measurements commonly suffer from a notably low signal-to-noise ratio (SNR). In recent years, many denoising methods have been developed that can usefully improve the SNR. However, when MRS measurements are carried out at sites with high noise levels, e.g., in human living environments, conventional methods fail to recover the effective MRS signal, which is submerged in extensive environmental noise. In addition, conventional methods that depend on signal models and the corresponding prior assumptions commonly rely on manual experience, which hinders the automation and efficiency of signal processing. To solve these problems, we apply an intelligent denoising framework built on a novel neural network, called Dn-ResUnet in this article, as the basic deep-learning tool. The network extracts features of the MRS signal through the encoder and decoder layers of the Dn-ResUnet structure, in which residual learning is adopted to accelerate training and improve denoising performance. Once training is completed, the network performs adaptive denoising with no need for 1) prior assumptions about the MRS signal and noise; 2) optimal filter-parameter tuning; or 3) expensive computation time. Comparison experiments demonstrate that the Dn-ResUnet model provides superior noise cancellation; in particular, it can replace conventional methods and recover effective MRS signals at noise levels down to an SNR of −30 dB. In addition, noise-attenuation tests are performed on synthetic and real data. The results show that the deep-learning framework yields convincing performance in MRS signal denoising.
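The abstract describes an encoder-decoder network with skip connections and residual learning, where the network predicts the noise component that is subtracted from the noisy record. The following PyTorch sketch is purely illustrative of that general architecture pattern; all layer sizes, names, and the class `DnResUnet1D` are assumptions, not the authors' published implementation.

```python
import torch
import torch.nn as nn

class DnResUnet1D(nn.Module):
    """Illustrative 1-D residual encoder-decoder denoiser (hypothetical layout,
    not the authors' Dn-ResUnet)."""

    def __init__(self, ch=16):
        super().__init__()
        # Encoder: feature extraction, then downsampling with wider channels.
        self.enc1 = nn.Sequential(nn.Conv1d(1, ch, 7, padding=3), nn.ReLU())
        self.down = nn.Sequential(
            nn.Conv1d(ch, 2 * ch, 4, stride=2, padding=1), nn.ReLU())
        # Decoder: upsample back to the original record length.
        self.up = nn.Sequential(
            nn.ConvTranspose1d(2 * ch, ch, 4, stride=2, padding=1), nn.ReLU())
        # U-Net-style skip connection: decoder output is concatenated with
        # the matching encoder features before the final projection.
        self.dec1 = nn.Conv1d(2 * ch, 1, 7, padding=3)

    def forward(self, x):
        e1 = self.enc1(x)                     # (B, ch, L)
        bottleneck = self.down(e1)            # (B, 2ch, L/2)
        d = self.up(bottleneck)               # (B, ch, L)
        d = torch.cat([d, e1], dim=1)         # skip connection
        # Residual learning: predict the noise, subtract it from the input.
        noise = self.dec1(d)
        return x - noise

net = DnResUnet1D()
noisy = torch.randn(2, 1, 256)  # batch of 2 synthetic records, 256 samples each
clean_est = net(noisy)          # denoised estimate, same shape as the input
```

Training such a model would minimize, e.g., the MSE between `clean_est` and a clean synthetic MRS signal; the residual (noise-prediction) formulation is what the abstract credits with faster training and better denoising.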
IEEE Transactions on Geoscience and Remote Sensing