Abstract
In recent years, convolutional neural networks (CNNs) have been successfully applied to the single image super-resolution (SISR) task. However, most CNN-based SISR methods achieve better performance at the cost of a huge number of training parameters, which increases the computational complexity of the models. Such SR networks place a heavy burden on computational resources and are therefore unsuitable for many real-world applications. Hence, there is interest in the computer vision community in an SR approach that uses fewer training parameters while delivering better SR performance. In this paper, we propose a computationally efficient SR approach called the enhanced progressive super-resolution network, i.e., E-ProSRNet. This approach is an enhanced version of our base model, ProSRNet. In the E-ProSRNet model, we propose a novel enhanced parallel densely connected residual network (E-PDRN), which helps to extract rich features from the low-resolution (LR) observation. The SR performance of the proposed E-ProSRNet model is better than that of ProSRNet, while it uses fewer training parameters than the ProSRNet model. Experimental analysis on common benchmark testing datasets shows that the proposed E-ProSRNet sets a new state of the art on the SISR task for upscaling factor ×4. The E-ProSRNet method obtains better SR performance than the proposed ProSRNet as well as other state-of-the-art methods, with a significant reduction in computational complexity.
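To make the idea of a parallel densely connected residual module concrete, the following is a minimal PyTorch sketch of a generic densely connected residual block with two parallel branches. All layer widths, kernel sizes, branch count, and class names here are illustrative assumptions for exposition, not the authors' actual E-PDRN specification.

```python
# Illustrative sketch only: a generic dense residual block and a two-branch
# parallel wrapper. Channel widths and depths are assumed, not from the paper.
import torch
import torch.nn as nn

class DenseResidualBlock(nn.Module):
    """Dense connections: each conv layer sees all previous feature maps;
    a residual (skip) connection adds the block input back at the end."""
    def __init__(self, channels=64, growth=32, num_layers=3):
        super().__init__()
        self.layers = nn.ModuleList()
        in_ch = channels
        for _ in range(num_layers):
            self.layers.append(nn.Sequential(
                nn.Conv2d(in_ch, growth, kernel_size=3, padding=1),
                nn.ReLU(inplace=True),
            ))
            in_ch += growth  # dense growth: next layer sees concatenated maps
        # 1x1 conv fuses the concatenated features back down to `channels`
        self.fuse = nn.Conv2d(in_ch, channels, kernel_size=1)

    def forward(self, x):
        feats = [x]
        for layer in self.layers:
            feats.append(layer(torch.cat(feats, dim=1)))
        return x + self.fuse(torch.cat(feats, dim=1))  # residual connection

class ParallelDenseResidual(nn.Module):
    """Two dense residual branches run in parallel on the same input;
    their outputs are concatenated and fused by a 1x1 conv."""
    def __init__(self, channels=64):
        super().__init__()
        self.branch_a = DenseResidualBlock(channels)
        self.branch_b = DenseResidualBlock(channels)
        self.fuse = nn.Conv2d(2 * channels, channels, kernel_size=1)

    def forward(self, x):
        return self.fuse(torch.cat([self.branch_a(x), self.branch_b(x)], dim=1))

# Usage: process a batch of 64-channel LR feature maps
block = ParallelDenseResidual(channels=64)
lr_feats = torch.randn(1, 64, 48, 48)
out = block(lr_feats)  # same spatial size and channel count as the input
```

The design intuition sketched here is that dense connections reuse earlier features cheaply, while parallel branches widen the receptive diversity without deepening the network, which is one common way lightweight SR models keep the parameter count down.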