Abstract
Although deep learning is widely used for image super-resolution reconstruction, network depth and model complexity continue to grow, and the difficulty of training these models now increases faster than their performance improves. This prevents deep learning frameworks from fully exploiting their generalization ability in image super-resolution reconstruction. In addition, existing reconstruction methods suffer from partial loss of feature information. This paper therefore first proposes a structure that emulates the feature-extraction function of the visual attention mechanism within a convolutional neural network, which we call a significant network connection. Feature information extracted by this architecture is emphasized as significant, while the remaining information is treated as less significant and thus has less influence on the final reconstruction. Second, a network architecture focused on collaborative information migration is proposed. It learns a latent domain for the intermediate state of the image domain to be reconstructed and allows the two networks to share this learned latent domain during reconstruction. The dual networks trained in this way are more symmetrical, better preserve the common feature information of the reconstructed image, and effectively alleviate the partial loss of feature information. Experimental results show that the texture, artifacts, and noise of images reconstructed by the proposed method are significantly better than those of images produced by other mainstream methods. The proposed method also shows a certain degree of improvement over other deep learning methods in training speed and feature-information retention.
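The abstract does not specify the exact form of the significant network connection; as a rough illustration of how attention-style feature weighting can emphasize salient channels inside a convolutional network, a minimal channel-attention sketch (in the spirit of squeeze-and-excitation) is shown below. The class name ChannelAttention, the reduction ratio, and the layer choices are illustrative assumptions, not the paper's design.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Hypothetical channel-attention block: re-weights feature maps so that
    salient channels contribute more to reconstruction. This is a sketch of the
    general technique, not the paper's exact structure."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)           # squeeze: global spatial context per channel
        self.fc = nn.Sequential(                      # excitation: per-channel attention weights
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w = self.fc(self.pool(x))                     # weights in (0, 1), one per channel
        return x * w                                  # emphasize significant features, suppress the rest

# Usage: attach after a convolutional feature extractor in a super-resolution network.
features = torch.randn(1, 64, 48, 48)                 # B x C x H x W low-resolution feature maps
weighted = ChannelAttention(64)(features)             # same shape, attention-weighted
```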