Abstract
Magnetic resonance (MR) images often suffer from random noise pollution during image acquisition and transmission, which impairs disease diagnosis by doctors or automated systems. In recent years, many noise removal algorithms with impressive performance have been proposed. In this work, inspired by the idea of deep learning, we propose a denoising method named 3D-Parallel-RicianNet, which combines global and local information to remove noise in MR images. Specifically, we introduce a powerful dilated convolution residual (DCR) module to expand the receptive field of the network and avoid the loss of global features. Then, to extract more local information and reduce the computational complexity, we design a depthwise separable convolution residual (DSCR) module to learn channel and position information in the image, which not only dramatically reduces the number of parameters but also improves local denoising performance. In addition, a parallel network is constructed by fusing the features extracted by each DCR module and DSCR module, which improves efficiency and reduces the complexity of training the denoising model. Finally, a reconstruction (REC) module constructs the clean image from the estimated noise deviation and the given noisy image. Due to the lack of ground-truth images in real MR datasets, the performance of the proposed model was evaluated qualitatively and quantitatively on one simulated T1-weighted MR image dataset and then extended to four real datasets. The experimental results show that the proposed 3D-Parallel-RicianNet achieves performance superior to that of several state-of-the-art methods in terms of the peak signal-to-noise ratio, structural similarity index, and entropy metric. In particular, our method demonstrates strong capabilities in both noise suppression and structure preservation.
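To make the described architecture concrete, below is a minimal PyTorch sketch of the two residual building blocks named in the abstract (DCR and DSCR) and their parallel feature fusion. The channel count, kernel sizes, dilation rate, and normalization/activation choices are illustrative assumptions, not the paper's exact configuration.

```python
# Hypothetical sketch of DCR and DSCR blocks for 3D MR patches (PyTorch).
# All hyperparameters below are assumptions for illustration.
import torch
import torch.nn as nn


class DCRBlock(nn.Module):
    """Dilated convolution residual (DCR) block: a dilated 3D convolution
    enlarges the receptive field to capture global context, and a skip
    connection preserves the input features."""

    def __init__(self, channels: int = 32, dilation: int = 2):
        super().__init__()
        self.conv = nn.Conv3d(channels, channels, kernel_size=3,
                              padding=dilation, dilation=dilation)
        self.bn = nn.BatchNorm3d(channels)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(x + self.bn(self.conv(x)))  # residual connection


class DSCRBlock(nn.Module):
    """Depthwise separable convolution residual (DSCR) block: a depthwise 3D
    convolution captures per-channel spatial (position) information, and a
    pointwise 1x1x1 convolution mixes channels, using far fewer parameters
    than a standard convolution."""

    def __init__(self, channels: int = 32):
        super().__init__()
        self.depthwise = nn.Conv3d(channels, channels, kernel_size=3,
                                   padding=1, groups=channels)
        self.pointwise = nn.Conv3d(channels, channels, kernel_size=1)
        self.bn = nn.BatchNorm3d(channels)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(x + self.bn(self.pointwise(self.depthwise(x))))


if __name__ == "__main__":
    # A small 3D MR feature patch: (batch, channels, depth, height, width).
    patch = torch.randn(1, 32, 16, 16, 16)
    # Parallel fusion of global (DCR) and local (DSCR) features by addition.
    fused = DCRBlock()(patch) + DSCRBlock()(patch)
    print(fused.shape)  # torch.Size([1, 32, 16, 16, 16])
```

In the full network, stacks of such blocks feed a reconstruction (REC) stage that subtracts the estimated noise deviation from the noisy input to recover the clean image; the simple additive fusion shown here is one plausible realization of the parallel design.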