Binary neural networks (BNNs) are an effective approach to reducing the memory usage and computational complexity of full-precision convolutional neural networks (CNNs) and have been widely used in deep learning. However, BNNs and real-valued models have different properties, making it difficult to draw directly on experience from CNN design when developing BNNs. In this article, we study the application of binary networks to the single-image super-resolution (SISR) task, in which the network is trained to restore the original high-resolution (HR) images. In general, the feature distributions in SISR networks are more complex than those in recognition models, since they must preserve abundant image information, e.g., texture, color, and details. To enhance the representation ability of BNNs, we propose a novel activation-rectified inference (ARI) module that achieves a more complete representation of features by combining observations from different quantization perspectives. The activations are divided into several parts with different quantization intervals and are inferred independently, which allows the binary activations to retain more image detail and yield finer inference. In addition, we propose an adaptive approximation estimator (AAE) that gradually learns an accurate gradient estimation interval in each layer to alleviate the optimization difficulty. Experiments on several benchmarks show that our approach learns a binary SISR model whose performance is superior to state-of-the-art methods. The code will be released at https://github.com/jwxintt/Rectified-BSR.
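The abstract does not give the exact formulation of ARI or AAE, but the two ideas it describes can be sketched in a minimal, hypothetical form: binarizing the same activations under several quantization intervals (here modeled as shifted sign thresholds) to retain more detail, and a straight-through gradient estimator whose clipping interval would, in the paper's AAE, be a learned per-layer quantity. All function names and thresholds below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def multi_interval_binarize(x, thresholds=(-0.5, 0.0, 0.5)):
    """Hypothetical sketch of the ARI idea: view the same activations
    from several quantization perspectives. Each threshold t gives an
    independent binary view sign(x - t); stacking the views preserves
    more detail than a single sign(x) binarization."""
    views = [np.where(x >= t, 1.0, -1.0) for t in thresholds]
    return np.stack(views)  # shape: (num_views, *x.shape)

def ste_grad(x, lo=-1.0, hi=1.0):
    """Straight-through estimator: the gradient of sign(x) is approximated
    by 1 inside a clipping interval [lo, hi] and 0 outside. In the paper's
    AAE, this interval is learned per layer (fixed here for illustration)."""
    return ((x >= lo) & (x <= hi)).astype(np.float32)
```

For example, an activation of 0.3 binarizes to -1 under the 0.5 threshold but to +1 under the other two, so the stacked views distinguish it from an activation of 0.9, which a single sign() would not.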