Abstract

Despite the existence of various super‐resolution (SR) methods, most of them focus on designing models for specific upscaling factors rather than fully exploiting inter‐scale correlation to improve efficiency. In contrast, multi‐scale SR methods can effectively reduce the redundancy of network parameters by aggregating the feature extraction processes corresponding to multiple scales into a unified process. The aim of this study is to enhance the compactness and efficiency of the SR model. Thus, an efficient multi‐scale SR method called the diverse branch feature refinement network (DBFRN) is proposed. By decoupling the training process from the inference process based on the idea of structural re‐parameterization, a multi‐branch topology is adopted to enrich multi‐scale learning, and the branches are merged to achieve efficient inference with equivalent effects. Specifically, two re‐parameterization strategies are designed, along with two corresponding feature refinement blocks for different feature levels in the multi‐scale SR network. Extensive experiments demonstrate that the proposed multi‐scale SR method is effective and efficient, and that it can outperform advanced single‐scale methods both quantitatively and qualitatively.
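The core idea behind structural re‐parameterization is that convolution is linear in its kernel, so the outputs of parallel convolutional branches can be summed by first summing their (suitably padded) kernels and then applying a single convolution. The snippet below is a minimal sketch of this merging step, not DBFRN's actual blocks: it merges a hypothetical two‐branch topology (a 3×3 branch and a 1×1 branch) into one equivalent 3×3 kernel, using a naive single‐channel convolution for clarity.

```python
import numpy as np

def conv2d(x, k):
    """Naive 'valid' 2D cross-correlation of a single-channel input x with kernel k."""
    kh, kw = k.shape
    H, W = x.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8))       # toy single-channel feature map

k3 = rng.standard_normal((3, 3))      # 3x3 branch kernel
k1 = rng.standard_normal((1, 1))      # 1x1 branch kernel

# Training-time multi-branch output: embed the 1x1 kernel at the centre
# of a 3x3 kernel (zero padding) so both branches produce the same shape.
k1_padded = np.pad(k1, 1)
multi_branch = conv2d(x, k3) + conv2d(x, k1_padded)

# Inference-time re-parameterized branch: sum the kernels, convolve once.
k_merged = k3 + k1_padded
single_branch = conv2d(x, k_merged)

print(np.allclose(multi_branch, single_branch))  # True: the merge is exact
```

Because the merged model performs one convolution instead of several, inference cost drops while the output stays numerically equivalent, which is the "efficient inference with equivalent effects" the abstract refers to.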
