Abstract

In terms of memory footprint and computational scale, lightweight Binary Neural Networks (BNNs) offer great advantages on resource-limited platforms, such as AIoT (Artificial Intelligence in Internet of Things) edge terminals and wearable or portable devices. However, the binarization process inherently causes considerable information loss and thus degrades accuracy. In this article, three improvements are introduced to raise the accuracy of the binarized ReActNet while keeping computational complexity low. Firstly, an improved Binarized Ghost Module (BGM) for the ReActNet is proposed to enrich the feature-map information, while the computational scale of this structure remains very low. Secondly, we propose a new Label-aware Loss Function (LLF) applied at the penultimate layer as a supervisor that takes label information into consideration. This auxiliary loss function makes each category&#x2019;s feature vectors more separable, and accordingly improves the classification accuracy of the final fully-connected layer. Thirdly, the Normalization-based Attention Module (NAM) is adopted to regulate the activation flow, which helps avoid the gradient saturation problem. With these three approaches, our improved binarized network outperforms other state-of-the-art methods, achieving 71.4% Top-1 accuracy on ImageNet and 86.45% accuracy on CIFAR-10. Meanwhile, its computational scale (OPs) is the lowest, at 0.86&#x00D7;10<sup>8</sup>, among mainstream BNN models. The experimental results prove the effectiveness of our proposals, and the study is promising for future low-power hardware implementations.
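The abstract does not give the form of the Label-aware Loss Function, but its stated effect (pulling each category&#x2019;s penultimate-layer feature vectors into tighter, more separable clusters) can be illustrated with a center-loss-style sketch. The function name, the fixed per-class centres, and the plain-Python formulation below are illustrative assumptions, not the paper&#x2019;s actual definition.

```python
def label_aware_loss(features, labels, centers):
    """Illustrative sketch of a label-aware auxiliary loss (assumed form).

    Computes the mean squared distance between each sample's penultimate-layer
    feature vector and the centre of its labelled class. Minimizing this pulls
    same-class features together, making classes more separable before the
    final fully-connected classifier.

    features : list of feature vectors (lists of floats), one per sample
    labels   : list of integer class labels, one per sample
    centers  : list of per-class centre vectors (here assumed pre-computed;
               in practice they would typically be learned jointly)
    """
    total = 0.0
    for feat, lab in zip(features, labels):
        centre = centers[lab]
        # 0.5 * squared Euclidean distance to the sample's own class centre
        total += 0.5 * sum((f - c) ** 2 for f, c in zip(feat, centre))
    return total / len(features)
```

In training, such an auxiliary term would typically be added to the ordinary cross-entropy loss with a small weighting factor, so the network learns both correct classification and compact per-class feature clusters.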
