Abstract
In recent years, convolutional neural networks (CNNs) for single image super-resolution (SISR) have become increasingly complex, and further improving SISR performance has become more challenging. In contrast, reference-based super-resolution (RefSR) is an effective strategy to boost super-resolution (SR) performance, since the introduced high-resolution (HR) references facilitate the prediction of high-frequency residuals. To the best of our knowledge, existing CNN-based RefSR methods treat the features from the references and the low-resolution (LR) input equally by simply concatenating them. However, the HR references and the LR inputs contribute differently to the final SR results. We therefore propose a progressive channel attention network (PCANet) for RefSR. This paper makes two technical contributions. First, we propose a novel channel attention module (CAM), which estimates the channel weighting parameters by a weighted average of the spatial features rather than a global average. Second, since the residual prediction process improves when the LR input is enriched with more details, we perform super-resolution progressively, which exploits the reference images at multiple scales. Extensive quantitative and qualitative evaluations on three benchmark datasets, representing three typical RefSR scenarios, demonstrate that our method outperforms state-of-the-art SISR and RefSR methods in terms of PSNR (Peak Signal-to-Noise Ratio) and SSIM (Structural Similarity).
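The abstract does not include an implementation; the snippet below is a minimal sketch, assuming a PyTorch-style module, of how channel attention with a learned weighted spatial average (rather than uniform global average pooling) could be structured. The module name, reduction ratio, and the softmax-normalized weighting map are illustrative assumptions, not the authors' CAM.

```python
import torch
import torch.nn as nn

class WeightedAvgChannelAttention(nn.Module):
    """Hypothetical sketch: channel attention whose spatial pooling weights
    are predicted per pixel instead of using a uniform global average."""

    def __init__(self, channels, reduction=16):
        super().__init__()
        # Predict one spatial weight per pixel; later normalized with softmax.
        self.spatial_weight = nn.Conv2d(channels, 1, kernel_size=1)
        # Squeeze-and-excitation style gating on the pooled channel descriptor.
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, h, w = x.shape
        # Normalized spatial weights replace the uniform 1/(H*W) of global average pooling.
        w_map = torch.softmax(self.spatial_weight(x).view(b, 1, -1), dim=-1)
        pooled = (x.view(b, c, -1) * w_map).sum(dim=-1)   # (b, c) weighted average
        scale = self.fc(pooled).view(b, c, 1, 1)          # per-channel gate
        return x * scale
```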