Abstract

Single-image super-resolution (SISR) is of great significance in remote sensing. Although state-of-the-art convolutional neural network (CNN)-based SISR methods achieve excellent results, their large model size and slow inference speed make them difficult to deploy in real remote sensing tasks. In this article, we propose a compact and efficient distance attention residual network (DARN) that achieves a better compromise between model accuracy and complexity. The distance attention residual connection block (DARCB), the core component of the DARN, uses multistage feature aggregation to learn more accurate feature representations. The main branch of the DARCB adopts a shallow residual block (SRB) to flexibly learn residual information and ensure the robustness of the model. We also propose a distance attention block (DAB) that serves as a bridge between the branches of the DARCB; the DAB effectively alleviates the loss of detail features during deep CNN feature extraction. Experimental results on two remote sensing and five super-resolution benchmark datasets demonstrate that the DARN achieves a better compromise between performance and model complexity than existing methods. In addition, compared with state-of-the-art lightweight remote sensing SISR methods, the DARN is optimal in terms of parameter count, computation cost, and inference speed. Our code will be available at https://github.com/candygogogogo/DARN.
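To make the block structure concrete, the following is a minimal sketch of a DARCB-style module inferred only from the abstract: a main branch of shallow residual blocks (SRBs) aggregated with the input through a distance attention block (DAB). All layer sizes and names are assumptions, and the attention here is a generic channel-attention placeholder, not the paper's actual distance attention formulation.

```python
# Hypothetical sketch of a DARCB-style block, inferred from the abstract.
# The real DARN implementation (see the linked repository) may differ.
import torch
import torch.nn as nn


class SRB(nn.Module):
    """Shallow residual block (assumed): two 3x3 convs with an identity skip."""
    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )

    def forward(self, x):
        return x + self.body(x)


class DAB(nn.Module):
    """Placeholder attention bridge; the paper's distance attention
    is not specified in the abstract, so a simple squeeze-and-excitation
    style channel attention stands in for it here."""
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.attn(x)


class DARCB(nn.Module):
    """Main branch of stacked SRBs, bridged to the skip path by a DAB."""
    def __init__(self, channels: int, n_srb: int = 2):
        super().__init__()
        self.main = nn.Sequential(*[SRB(channels) for _ in range(n_srb)])
        self.dab = DAB(channels)

    def forward(self, x):
        return x + self.dab(self.main(x))


x = torch.randn(1, 32, 48, 48)
y = DARCB(32)(x)
print(tuple(y.shape))  # spatial size and channels are preserved
```

Because the block preserves the feature-map shape, several DARCBs can be stacked to form the network body before a final upsampling stage, which is the usual layout for lightweight SISR models.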
