Abstract

Deep learning approaches have achieved remarkable performance in single image super-resolution (SISR), but their heavy memory consumption and computational complexity hinder deployment on real-world devices. We design a lightweight asymmetric dilation distillation network (ADDN) that cascades asymmetric dilation distillation modules (ADDMs) as feature extraction blocks to efficiently refine hierarchical features. In our design, asymmetric dilation residual blocks (ADRBs) are connected within each ADDM in an information distillation manner. Specifically, the ADRB regulates its dilation factors to expand the receptive field by different multiples while substantially reducing the number of parameters. Quantitative and qualitative comparisons with existing methods show that the proposed ADDN achieves superior performance on four public benchmark datasets with far fewer parameters and less memory storage.
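The abstract does not spell out the block definitions, but the two ingredients it names, asymmetric (factorized) kernels and dilation, have well-known arithmetic behind the claimed savings: a k×1 plus 1×k pair needs fewer weights than a full k×k kernel, and a dilation factor d stretches a k-tap kernel to span d·(k−1)+1 pixels. The sketch below (function names are ours, not the paper's) makes that concrete:

```python
def conv_params(in_ch, out_ch, kh, kw, bias=True):
    """Parameter count of a 2-D convolution layer with a kh x kw kernel."""
    return in_ch * out_ch * kh * kw + (out_ch if bias else 0)

def receptive_field(kernel, dilation):
    """Span of a dilated 1-D kernel: dilation * (kernel - 1) + 1."""
    return dilation * (kernel - 1) + 1

# Standard 3x3 convolution with 64 input and output channels
standard = conv_params(64, 64, 3, 3)            # 36928 parameters

# Asymmetric pair: 3x1 followed by 1x3 (same 3x3 coverage)
asymmetric = conv_params(64, 64, 3, 1) + conv_params(64, 64, 1, 3)  # 24704

print(standard, asymmetric)   # the factorized pair uses ~33% fewer weights
print(receptive_field(3, 2))  # dilation 2 stretches a 3-tap kernel to span 5
```

Stacking such blocks with different dilation factors is how a network can cover large receptive fields without paying the quadratic parameter cost of large dense kernels.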
