Abstract
Medical image super-resolution (SR) reconstruction algorithms based on deep learning with standard 3D convolution involve a large number of network parameters, which leads to high computational complexity and low training efficiency. To address these problems, a lightweight densely connected residual 3D-CNN (P3DSRNet) that combines dense residual connections with pseudo-3D convolution is proposed. First, dense residual blocks are used to widen the channels of the convolution layers, so that more feature information is passed to the activation function and low-level image features are propagated to higher layers, improving medical image super-resolution. Then, a pseudo-3D separable convolution strategy is employed to train the network: the standard 3D convolution kernel is separated into multiple convolution kernels, and the network is trained stage by stage, so training converges faster. This effectively alleviates the sharp growth in the number of parameters and the increased training difficulty caused by the widened dimension of standard 3D convolution. Experimental results show that the medical images reconstructed by the P3DSRNet model have clearer texture details and better visual quality than those produced by traditional interpolation and the LRTV super-resolution algorithm. The P3DSRNet model also has fewer network parameters than the SRCNN3D and ReCNN super-resolution algorithms, while the PSNRs of its SR images increase by 1.88 dB and 0.30 dB and the SSIMs increase by 0.0096 and 0.0011, respectively, compared with SRCNN3D and ReCNN. P3DSRNet not only reduces the number of parameters and computational complexity but also improves the super-resolution performance of medical images.
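The abstract does not give implementation details, so the following is only a minimal sketch of the two ideas it names: a pseudo-3D separable convolution (here assumed to factor a 3x3x3 kernel into a 1x3x3 spatial convolution followed by a 3x1x1 convolution along the slice axis) and a densely connected residual block built from it. All class names, layer counts, and hyperparameters are illustrative assumptions, not the authors' code.

```python
# Sketch only: pseudo-3D separable convolution and a dense residual block.
# Hyperparameters (growth rate, number of layers) are assumed, not from the paper.
import torch
import torch.nn as nn


class PseudoConv3d(nn.Module):
    """Factor a k*k*k 3D convolution into a (1,k,k) spatial part and a (k,1,1) depth part."""

    def __init__(self, in_channels: int, out_channels: int, k: int = 3):
        super().__init__()
        p = k // 2
        # Spatial convolution applied within each slice: kernel (1, k, k).
        self.spatial = nn.Conv3d(in_channels, out_channels,
                                 kernel_size=(1, k, k), padding=(0, p, p))
        # Convolution along the slice (depth) axis: kernel (k, 1, 1).
        self.depth = nn.Conv3d(out_channels, out_channels,
                               kernel_size=(k, 1, 1), padding=(p, 0, 0))
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.act(self.depth(self.act(self.spatial(x))))


class DenseResidualBlock(nn.Module):
    """Each layer receives the concatenation of all previous feature maps;
    a local residual connection adds the block input back to the output."""

    def __init__(self, channels: int, growth: int = 16, n_layers: int = 4):
        super().__init__()
        self.layers = nn.ModuleList(
            PseudoConv3d(channels + i * growth, growth) for i in range(n_layers)
        )
        # 1x1x1 fusion back to the block's channel width.
        self.fuse = nn.Conv3d(channels + n_layers * growth, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = [x]
        for layer in self.layers:
            features.append(layer(torch.cat(features, dim=1)))
        return x + self.fuse(torch.cat(features, dim=1))


if __name__ == "__main__":
    # Toy check on a small volume: (batch, channels, depth, height, width).
    block = DenseResidualBlock(channels=32)
    volume = torch.randn(1, 32, 16, 32, 32)
    print(block(volume).shape)  # torch.Size([1, 32, 16, 32, 32])
```

Under this factorization, a kernel pair of sizes (1, k, k) and (k, 1, 1) needs on the order of k^2 + k weights per channel pair instead of the k^3 required by a full 3D kernel, which is one plausible reading of how the reported parameter reduction arises.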