Abstract
The outstanding performance of deep convolutional neural networks comes from their effective feature extraction and learning ability. Although binary neural networks (BNNs) have clear advantages in storage and efficiency over their full-precision counterparts on resource-constrained hardware, the accuracy degradation caused by binary quantization remains an unavoidable problem. The activation distribution in BNNs is a key factor affecting network performance. To improve the accuracy of BNNs, in this paper we propose to regulate the activation distribution to strengthen the representation ability of BNNs. We first propose an Information Entropy enhancement Basic block (IEBlock) to build a competitive baseline model with higher information entropy in the output activation distribution. Specifically, we build the IEBlock by deliberately reorganizing the positions of the elements in the normal basic block, guided by a deep analysis of the information flow. After that, we propose a Depth-aware Activation Distribution Amendment (DADA) module, which learns the interdependencies of feature channels to amend the activation distribution whose information is lost after binary convolution. Extensive experiments demonstrate that our method effectively improves the information entropy of binary activations and elevates the accuracy of BNNs, outperforming state-of-the-art methods on the CIFAR-10 and ImageNet datasets. Code is available at: https://github.com/tomorrow-rain/RAD-BNN
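The abstract's core quantities can be illustrated with a short sketch. The snippet below is a minimal NumPy illustration, not the paper's implementation: `binarize` is the standard sign binarization used in BNNs, `binary_entropy` computes the information entropy of a binary activation map (maximal, 1 bit, when +1 and -1 are equally likely), and `channel_reweight` is a hypothetical squeeze-and-excitation-style channel gate standing in for the idea of learning channel interdependencies to amend the activation distribution; the weight shapes and layer choices are assumptions.

```python
import numpy as np

def binarize(x):
    # Sign binarization used in BNNs: real-valued activations -> {-1, +1}.
    return np.where(x >= 0, 1.0, -1.0)

def binary_entropy(b):
    # Information entropy (in bits) of a binary activation map.
    # Maximal (1 bit) when +1 and -1 occur with equal probability.
    p = float(np.mean(b == 1.0))
    if p in (0.0, 1.0):
        return 0.0
    return -(p * np.log2(p) + (1.0 - p) * np.log2(1.0 - p))

def channel_reweight(x, w1, w2):
    # Hypothetical SE-style channel attention (illustrative stand-in for
    # amending the activation distribution via channel interdependencies):
    # global average pool -> small bottleneck -> sigmoid gate per channel.
    # x: (C, H, W); w1: (r, C); w2: (C, r), with r a reduction size.
    s = x.mean(axis=(1, 2))                  # (C,) channel descriptors
    z = np.maximum(w1 @ s, 0.0)              # ReLU bottleneck
    g = 1.0 / (1.0 + np.exp(-(w2 @ z)))      # sigmoid gates in (0, 1)
    return x * g[:, None, None]              # rescale each channel
```

A balanced binary map attains the maximal entropy of 1 bit, while a map collapsed to a single sign carries 0 bits, which is why regulating the pre-binarization distribution matters.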