Dense connections have proved to be an effective way to fully exploit the hierarchical features that deep neural networks (DNNs) extract from low-resolution images for single image super-resolution (SISR). However, existing densely connected networks are often designed manually and depend heavily on practical experience, which leads to suboptimal performance. Moreover, their complicated connections make them memory-consuming and hard to interpret. To address all of these problems at once, we propose to construct a new densely connected network for SISR from a dynamic-system perspective. Following this idea, we cast the hierarchical transformation of a DNN as the state evolution of a fractional-order dynamic system, which allows us to construct two interdependent densely connected modules automatically from the system solution rather than by manual design: a prediction module that controls the system to iteratively predict the next state, and a correction module that iteratively refines the predicted state to improve prediction accuracy. With these two modules as the backbone, we establish a Fractional-order Differential Equations-based network (FDE-Net) for SISR. Since the iterative computation requires both densely connected modules to have a recurrent structure, FDE-Net is memory-efficient and offers good interpretability. In addition, we analyze the existence and uniqueness of the solution of the FDE to theoretically guarantee the feasibility of FDE-Net. Experiments on four SISR benchmark datasets demonstrate the superiority of FDE-Net over existing densely connected networks and other baselines in terms of generalization capacity, especially under limited memory.
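The abstract does not state the underlying discretization, but the prediction/correction pairing it describes parallels the standard fractional Adams-Bashforth-Moulton scheme for a Caputo fractional differential equation. The following is a generic sketch of that scheme for illustration only; the symbols (state $x$, right-hand side $f$, order $\alpha$, step size $h$, quadrature weights $a_{j,n+1}$ and $b_{j,n+1}$) are assumptions and are not taken from the paper.

\[
{}^{C}D_t^{\alpha} x(t) = f\bigl(t, x(t)\bigr), \qquad 0 < \alpha \le 1,
\]

with the predictor (fractional Adams-Bashforth step)

\[
x_{n+1}^{P} = x_0 + \frac{1}{\Gamma(\alpha)} \sum_{j=0}^{n} b_{j,n+1}\, f(t_j, x_j),
\]

and the corrector (fractional Adams-Moulton step)

\[
x_{n+1} = x_0 + \frac{h^{\alpha}}{\Gamma(\alpha+2)} \Bigl( f\bigl(t_{n+1}, x_{n+1}^{P}\bigr) + \sum_{j=0}^{n} a_{j,n+1}\, f(t_j, x_j) \Bigr).
\]

Because both sums run over all previous states, each update depends on the full history of the trajectory, which is the dynamic-system counterpart of dense connections; evaluating the sums iteratively with a shared $f$ is what makes a recurrent, memory-efficient realization of the prediction and correction modules plausible.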