Abstract
High-quality images are important for high-level vision tasks. However, due to human factors and camera hardware limitations, digital devices often capture low-resolution images. Deep networks can effectively restore these degraded images through their strong learning abilities. However, most such networks depend on deeper architectures to improve the clarity of predicted images, and single-path features cannot deal well with complex scenes. In this paper, we propose a dual super-resolution CNN (DSRCNN) to obtain high-quality images. DSRCNN relies on two sub-networks to extract complementary low-frequency features, which enhances the learning ability of the SR network. To prevent a long-term dependency problem, a combination of convolutions and a residual learning operation is embedded into the dual sub-networks. To prevent loss of information from the original image, an enhanced block gathers original information together with the high-frequency information obtained from a deeper layer via sub-pixel convolutions. To obtain more high-frequency features, a feature learning block learns further details of the high-frequency information. The proposed method is well suited to super-resolving complex scenes. Experimental results show that the proposed DSRCNN is superior to other popular SR networks. For instance, our DSRCNN achieves a 0.08 dB improvement over MemNet on Set5 for ×3 upscaling.
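The sub-pixel convolution mentioned in the abstract upscales an image by rearranging channels into spatial positions (the "pixel shuffle" of Shi et al., 2016). A minimal numpy sketch of that rearrangement, assuming a scale factor `r` and a channels-first tensor (the function name and shapes are illustrative, not taken from the paper's code):

```python
import numpy as np

def pixel_shuffle(x, r):
    """Rearrange a (C*r*r, H, W) array into (C, H*r, W*r),
    the upscaling step performed by a sub-pixel convolution."""
    c_r2, h, w = x.shape
    assert c_r2 % (r * r) == 0, "channel count must be divisible by r*r"
    c = c_r2 // (r * r)
    # split the channel axis into (C, r, r) ...
    x = x.reshape(c, r, r, h, w)
    # ... then interleave the two r-axes into the spatial axes
    x = x.transpose(0, 3, 1, 4, 2)          # (C, H, r, W, r)
    return x.reshape(c, h * r, w * r)

# a 4-channel 2x2 feature map becomes a 1-channel 4x4 map for r = 2
lr = np.arange(16).reshape(4, 2, 2)
hr = pixel_shuffle(lr, 2)
print(hr.shape)  # (1, 4, 4)
```

Each output 2×2 block draws one value from each of the four input channels, so the learned convolution preceding this step decides how high-frequency detail is distributed in the upscaled image.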
Highlights
Due to human factors and camera hardware limitations, captured images are often not clear
We propose a dual super-resolution convolutional neural network (CNN), DSRCNN, built from three blocks (i.e., a two-sub-network enhanced block (TSEB), an enhanced block (EB), and a feature learning block (FLB)) to obtain high-quality images
To improve the training speed of DSRCNN, the given LR images are cropped into 64 × 64 image patches
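The patch-cropping step in the highlight above can be sketched as a simple sliding-window crop; a hypothetical numpy helper, assuming non-overlapping 64 × 64 patches (the stride and function name are illustrative assumptions, not specified by the paper):

```python
import numpy as np

def crop_patches(img, size=64, stride=64):
    """Crop an (H, W, C) LR image into (N, size, size, C) training
    patches; 64 x 64 patches as used for training DSRCNN."""
    h, w = img.shape[:2]
    patches = []
    for top in range(0, h - size + 1, stride):
        for left in range(0, w - size + 1, stride):
            patches.append(img[top:top + size, left:left + size])
    return np.stack(patches)

# a 128 x 192 image yields a 2 x 3 grid of non-overlapping patches
lr = np.zeros((128, 192, 3), dtype=np.uint8)
print(crop_patches(lr).shape)  # (6, 64, 64, 3)
```

Training on small fixed-size patches keeps mini-batches uniform and reduces per-step memory, which is why most SR networks adopt this convention.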
Summary
Jiagang Song 1,†, Jingyu Xiao 2,*,†, Chunwei Tian 3,4,5, Yuxuan Hu 2, Lei You 6 and Shichao Zhang 2.