Abstract

In recent years, deep unfolding networks (DUNs) have received widespread attention in compressed sensing (CS) reconstruction because of their good interpretability and strong mapping capability. However, existing DUNs often improve reconstruction quality at the expense of a large number of parameters, and they suffer from information loss during long-distance feature transmission. To address these problems, we propose an unfolding network architecture that mixes Transformer and large-kernel convolution for sparse sampling and reconstruction of natural images, namely a reconstruction network based on Transformer and convolution (TCR-Net). The Transformer framework inherently captures global context through its self-attention mechanism, which effectively addresses the challenge of long-range feature dependencies. TCR-Net is an end-to-end two-stage architecture. First, a data-driven pre-trained encoder performs sparse representation and basic feature extraction of the image. Second, a new attention mechanism replaces the self-attention in the Transformer, and an optimization-inspired hybrid Transformer-convolution module is designed; its iterative process yields the unfolding framework, which approximates the original image stage by stage. Experimental results show that TCR-Net outperforms existing state-of-the-art CS methods while maintaining fast computation. Specifically, at a CS ratio of 0.10, the average PSNR on the test sets used in this paper improves by at least 0.8%, the average SSIM improves by at least 1.5%, and the processing speed exceeds 70 FPS. These quantitative results show that our method is computationally efficient while ensuring high-quality image restoration.
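To make the deep-unfolding idea behind architectures like TCR-Net concrete, the sketch below runs a fixed number of ISTA (iterative shrinkage-thresholding) stages for a toy CS problem y = Φx. This is a generic, hand-set illustration of the unfolding principle only: in an actual DUN each stage's step size, threshold, and proximal operator would be replaced by learned modules (in TCR-Net, the hybrid Transformer-convolution blocks), which are not reproduced here.

```python
# Illustrative ISTA iterations for compressed sensing: y = Phi @ x, x sparse.
# A deep unfolding network turns each fixed iteration below into a trainable
# stage; the hand-set `step` and `lam` stand in for learned parameters.

def matvec(A, x):
    # Dense matrix-vector product over plain Python lists.
    return [sum(a * v for a, v in zip(row, x)) for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def soft_threshold(v, lam):
    # Proximal operator of the l1 norm; this shrinkage step promotes sparsity.
    return [max(abs(t) - lam, 0.0) * (1.0 if t >= 0 else -1.0) for t in v]

def ista_unfolded(Phi, y, stages=200, step=0.1, lam=0.01):
    """Run a fixed number of ISTA stages (gradient step on ||y - Phi x||^2,
    then soft-thresholding). `stages` plays the role of the unfolded depth."""
    PhiT = transpose(Phi)
    x = [0.0] * len(Phi[0])
    for _ in range(stages):
        residual = [yi - ri for yi, ri in zip(y, matvec(Phi, x))]
        grad_step = [xi + step * gi for xi, gi in zip(x, matvec(PhiT, residual))]
        x = soft_threshold(grad_step, step * lam)
    return x
```

Run on an underdetermined system (2 measurements, 4 unknowns), the iterations drive the measurement residual toward zero while keeping entries outside the support at exactly zero, which is the stage-by-stage approximation behavior that unfolding networks learn to accelerate.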
