Abstract

To improve image feature extraction, reduce the complexity of model parameters, and enhance the reconstruction quality of image super-resolution (SR), a structured fusion attention network (SFAN) is proposed. First, deep convolution is used to extract shallow features from the low-resolution image, and different residual attention modules are introduced to improve the encoder's structured residual so that more image features can be extracted. Second, the features output by the encoder are refined: the spatial attention module and the channel attention module are recombined by an improved fusion attention method to provide better input features for PixelShuffle, which carries out the decoder's reconstruction. Finally, by adding the low-frequency input to the network's prediction, the input image is directly interpolated to the target resolution, which accelerates convergence on the high-frequency residual and improves the reconstruction result. At reconstruction scales of $\times 2$, $\times 3$, and $\times 4$, SFAN is compared with several state-of-the-art SR networks on the public Set5, Set14, BSD100, Urban100, and Manga109 datasets. The experimental results show that SFAN achieves the best PSNR and SSIM values with a low parameter count, demonstrating that SFAN strikes a good balance between SR performance and parameter complexity.
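To make the described pipeline concrete, the following is a minimal PyTorch sketch of the stages named in the abstract: shallow feature extraction, residual attention blocks in the encoder, a fused channel/spatial attention refinement, PixelShuffle reconstruction, and a global skip that adds the interpolated low-frequency input to the predicted high-frequency residual. All module names, channel widths, and block counts are assumptions for illustration, not the authors' exact SFAN configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention (assumed form)."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.fc(self.pool(x))

class SpatialAttention(nn.Module):
    """Spatial attention over pooled channel statistics (assumed form)."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)
        mx, _ = x.max(dim=1, keepdim=True)
        return x * torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))

class ResidualAttentionBlock(nn.Module):
    """Residual block whose output is re-weighted by channel attention."""
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
            ChannelAttention(channels),
        )

    def forward(self, x):
        return x + self.body(x)

class SFANSketch(nn.Module):
    """Illustrative skeleton of the abstract's pipeline, not the published SFAN."""
    def __init__(self, scale=2, channels=64, num_blocks=8):
        super().__init__()
        self.head = nn.Conv2d(3, channels, 3, padding=1)  # shallow feature extraction
        self.encoder = nn.Sequential(
            *[ResidualAttentionBlock(channels) for _ in range(num_blocks)]
        )
        # Fused attention: channel attention followed by spatial attention.
        self.fusion = nn.Sequential(ChannelAttention(channels), SpatialAttention())
        self.upsample = nn.Sequential(
            nn.Conv2d(channels, 3 * scale * scale, 3, padding=1),
            nn.PixelShuffle(scale),  # decoder / reconstruction step
        )
        self.scale = scale

    def forward(self, x):
        feat = self.head(x)
        feat = self.fusion(self.encoder(feat) + feat)  # structured residual over the encoder
        hf = self.upsample(feat)                       # predicted high-frequency residual
        # Global skip: bicubic-interpolated low-frequency input plus the prediction.
        base = F.interpolate(x, scale_factor=self.scale,
                             mode='bicubic', align_corners=False)
        return base + hf
```

In this reading, the network only has to learn the high-frequency residual on top of the interpolated input, which is the mechanism the abstract credits for faster convergence.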
