Abstract

We propose a plug-and-play neural architecture search (NAS) method to explore diverse architectures for single image super-resolution (SISR). Unlike current NAS-based methods that adopt either a single-path setting or a pipeline setting, our method achieves a trade-off between architectural diversity and search cost. We formulate the task in a differentiable manner and inherit the architecture-parameter optimization scheme from Discrete Stochastic Neural Architecture Search (DSNAS). Besides directly searching for operations, we also search, for each node in a cell, the activation function, the from-node, and the skip-connection node, which diversifies the searched architecture topologies. Searching the skip-connection node individually avoids the phenomenon of excessive skip connections. Moreover, to alleviate the influence of inconsistent architectures between the training and testing phases, we introduce random variables into the architecture parameters as regularization. Benchmark experiments show that our method achieves state-of-the-art performance under given parameter and FLOPs constraints. Compared with other NAS-based SISR methods, our method achieves better performance with less search time and fewer resources. These results further demonstrate the effectiveness of the proposed NAS method.
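To make the searchable choices described above concrete, the sketch below (PyTorch; not the authors' implementation, and all names and hyperparameters are assumptions) shows one cell node that samples an operation, an activation function, a from-node, and a separate skip-connection node via discrete straight-through sampling, with Gaussian noise on the architecture parameters standing in for the random-variable regularization.

```python
# Hypothetical sketch of a searchable node; details are assumptions, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


def sample_choice(alpha, noise_std=0.1):
    """Discrete straight-through sample from architecture logits `alpha`.
    Gaussian noise is added as an assumed form of the paper's random-variable
    regularization on the architecture parameters."""
    logits = alpha + noise_std * torch.randn_like(alpha)
    return F.gumbel_softmax(logits, tau=1.0, hard=True)  # one-hot, differentiable


class SearchableNode(nn.Module):
    """One node in a cell: searches over candidate ops, activations,
    the from-node, and an independently chosen skip-connection node."""

    def __init__(self, channels, num_prev_nodes):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.Conv2d(channels, channels, 5, padding=2),
            nn.Conv2d(channels, channels, 1),
        ])
        self.acts = [F.relu, F.leaky_relu, torch.tanh]
        # Architecture parameters: operation, activation, from-node, skip-connection node.
        self.alpha_op = nn.Parameter(torch.zeros(len(self.ops)))
        self.alpha_act = nn.Parameter(torch.zeros(len(self.acts)))
        self.alpha_from = nn.Parameter(torch.zeros(num_prev_nodes))
        self.alpha_skip = nn.Parameter(torch.zeros(num_prev_nodes))

    def forward(self, prev_outputs):
        # The skip-connection source is sampled independently of the main input,
        # so skip connections are not tied to the operation path.
        w_from = sample_choice(self.alpha_from)
        w_skip = sample_choice(self.alpha_skip)
        x = sum(w * h for w, h in zip(w_from, prev_outputs))
        skip = sum(w * h for w, h in zip(w_skip, prev_outputs))

        w_op = sample_choice(self.alpha_op)
        out = sum(w * op(x) for w, op in zip(w_op, self.ops))

        w_act = sample_choice(self.alpha_act)
        out = sum(w * act(out) for w, act in zip(w_act, self.acts))
        return out + skip


if __name__ == "__main__":
    node = SearchableNode(channels=16, num_prev_nodes=2)
    prev = [torch.randn(1, 16, 32, 32) for _ in range(2)]
    print(node(prev).shape)  # torch.Size([1, 16, 32, 32])
```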
