Abstract

Binary neural networks (BNNs) have drawn much attention as one of the most promising techniques for meeting memory-footprint and inference-speed requirements. However, they still suffer from severe intrinsic instability of error convergence, which increases both the prediction error and its standard deviation; this is caused mostly by the inherently poor representation afforded by only two possible weight values, −1 and +1. In this work, we propose a cost-aware layer-wise ensemble method that addresses this issue without incurring excessive cost, characterized by (1) layer-wise bagging and (2) cost-aware selection of the layers to bag. In one experiment, the proposed method reduced the error and its standard deviation on CIFAR-10 by 15% and 54%, respectively, compared to a baseline BNN. We demonstrate and discuss this error reduction and stability improvement, and their versatility, through comparisons that combine the base network model with the proposed method and state-of-the-art prior techniques while varying the network sizes and the evaluation datasets (CIFAR-10, SVHN, and MNIST).
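
The abstract names two components: layer-wise bagging and cost-aware layer selection. The sketch below is a minimal, hypothetical illustration of those two ideas in Python/NumPy, not the paper's actual implementation. The class names (`BinaryLinear`, `BaggedBinaryLinear`), the greedy cheapest-first selection policy, and the use of independently initialized replicas (real bagging would train each replica on a bootstrap sample, omitted here) are all assumptions made for illustration.

```python
import numpy as np

def binarize(w):
    # Deterministic binarization to the two values {-1, +1} via sign.
    return np.where(w >= 0, 1.0, -1.0)

class BinaryLinear:
    """A linear layer whose weights are binarized at inference time."""
    def __init__(self, in_dim, out_dim, rng):
        self.w = rng.standard_normal((in_dim, out_dim)) * 0.1

    def forward(self, x):
        return x @ binarize(self.w)

class BaggedBinaryLinear:
    """Layer-wise bagging (assumed form): keep k independently
    initialized binary replicas of one layer and average their
    activations, which smooths the coarse {-1, +1} representation."""
    def __init__(self, in_dim, out_dim, k, rng):
        self.replicas = [BinaryLinear(in_dim, out_dim, rng) for _ in range(k)]

    def forward(self, x):
        return np.mean([r.forward(x) for r in self.replicas], axis=0)

def cost_aware_selection(layer_dims, budget):
    """Pick which layers to bag, cheapest first, until an
    extra-parameter budget is spent (an assumed greedy policy;
    the paper's actual selection criterion is not in the abstract)."""
    costs = [(i, d_in * d_out) for i, (d_in, d_out) in enumerate(layer_dims)]
    selected, spent = [], 0
    for i, c in sorted(costs, key=lambda t: t[1]):
        if spent + c <= budget:
            selected.append(i)
            spent += c
    return selected

# Usage with dummy data and dimensions:
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))                 # batch of 4, 8 features
layer = BaggedBinaryLinear(8, 16, k=3, rng=rng)
y = layer.forward(x)                            # averaged over 3 replicas
print(y.shape)                                  # (4, 16)
print(cost_aware_selection([(8, 16), (16, 16), (16, 4)], budget=200))  # [2, 0]
```

The design intuition this sketch tries to capture is that only the layers selected under the cost budget pay the duplication overhead, so the ensemble's stability benefit is obtained without replicating the whole network.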

