Abstract

Although convolutional neural network-based methods have achieved significant performance improvements for Single Image Super-Resolution (SISR), their heavy computational cost hinders deployment in real-world environments. Thus, interest in lightweight networks for SISR is rising. Because existing lightweight SISR models focus mainly on extracting fine local features with convolution operations, they struggle to capture global information. To capture long-range dependencies, Non-Local (NL) attention and Transformers have been explored for SISR, but they still suffer from a trade-off between performance and computational cost. In this paper, we propose the Fast Non-Local attention NETwork (FNLNET) for super-lightweight SISR, which can capture global representations. To acquire global information, we propose the Fast Non-Local Attention (FNLA) module, which has low computational complexity while capturing a global representation that reflects long-distance relationships between patches. FNLA requires 16 times less computation than conventional NL attention while improving performance. In addition, we propose a powerful module called Global Self-Intension Mining (GSIM) that fuses multiple information sources, such as local and global representations. In experiments on benchmark datasets, FNLNET shows outstanding performance with fewer parameters and lower computational cost than state-of-the-art lightweight SISR models.
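For context, the sketch below shows a conventional non-local attention block in PyTorch, the kind of baseline whose quadratic cost over all spatial positions the proposed FNLA is designed to reduce. This is a minimal illustration under assumed names and shapes (the class name, reduction ratio, and layer layout are not taken from the paper), not the FNLA or GSIM implementation itself.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NonLocalBlock(nn.Module):
    """Conventional non-local (self-attention) block over all spatial positions.

    The attention matrix has shape (H*W, H*W), so the cost grows quadratically
    with image size; lightweight variants such as FNLA aim to cut this cost.
    All names and the reduction ratio here are illustrative assumptions.
    """

    def __init__(self, channels: int, reduction: int = 2):
        super().__init__()
        inter = channels // reduction
        self.theta = nn.Conv2d(channels, inter, kernel_size=1)  # query embedding
        self.phi = nn.Conv2d(channels, inter, kernel_size=1)    # key embedding
        self.g = nn.Conv2d(channels, inter, kernel_size=1)      # value embedding
        self.out = nn.Conv2d(inter, channels, kernel_size=1)    # restore channels

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        q = self.theta(x).flatten(2).transpose(1, 2)   # (B, HW, C')
        k = self.phi(x).flatten(2)                      # (B, C', HW)
        v = self.g(x).flatten(2).transpose(1, 2)        # (B, HW, C')
        attn = F.softmax(q @ k, dim=-1)                 # (B, HW, HW): the quadratic term
        y = (attn @ v).transpose(1, 2).reshape(b, -1, h, w)
        return x + self.out(y)                          # residual connection


# Usage example on a small feature map.
if __name__ == "__main__":
    feat = torch.randn(1, 64, 48, 48)
    block = NonLocalBlock(channels=64)
    print(block(feat).shape)  # torch.Size([1, 64, 48, 48])
```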
