Abstract

In recent years, weight-binarized neural network (BNN) technology has made substantial progress. However, networks with both binarized inputs and binarized weights suffer either from low accuracy in pattern recognition or from inefficient hardware implementation. This work proposes a spatio-temporal binary neural network (STBNN) to address this problem. STBNN has binary network inputs/outputs, binary neuron inputs/outputs, and binarized weights, and it folds the computationally expensive batch normalization (BN) operation widely used in previous BNNs into the neuron threshold. STBNN greatly reduces computing resources and storage space while maintaining high accuracy (e.g., 98.0% on the MNIST test set). With binary inputs (0 or 1) and binarized weights (±1), the input-weight product can be realized in hardware by a 1-bit Signed AND operation instead of a multiplication, significantly reducing computing resources, memory requirements, and power consumption. Compared with a 32-bit multi-layer perceptron (MLP)-based hardware design, the STBNN-based design reduces these three metrics by 84.2%, 96.4%, and 96.7%, respectively. This work provides an effective method for constructing hardware-friendly neural network models and a guide for designing extremely hardware-efficient neural network processors.
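The two ideas summarized above, computing each input-weight product with a 1-bit Signed AND and folding batch normalization into the neuron threshold, can be illustrated with a minimal sketch. This is not the authors' implementation: the function names, the bit-packing convention, and the BN parameters (gamma, beta, mu, sigma) are illustrative assumptions, and a real design would operate on packed hardware registers rather than Python integers.

```python
# Minimal sketch of the two ideas above; not the authors' implementation.
# The bit-packing convention and all names here are illustrative assumptions.

def fold_bn_into_threshold(gamma, beta, mu, sigma):
    """BN(z) = gamma * (z - mu) / sigma + beta.
    For gamma > 0, sign(BN(z)) >= 0 iff z >= mu - beta * sigma / gamma,
    so the whole BN step collapses into a per-neuron firing threshold."""
    assert gamma > 0, "sign flips for gamma < 0; would need separate handling"
    return mu - beta * sigma / gamma

def signed_and_dot(x_bits, w_pos_bits, w_neg_bits):
    """Dot product of binary inputs x in {0,1}^n with weights w in {-1,+1}^n.
    Inputs and weight signs are packed into integers: bit i of w_pos_bits
    (w_neg_bits) is set where w_i = +1 (w_i = -1). Each product x_i * w_i
    then reduces to a 1-bit AND with the weight's sign, and the accumulation
    to two population counts."""
    pos = bin(x_bits & w_pos_bits).count("1")
    neg = bin(x_bits & w_neg_bits).count("1")
    return pos - neg

def binary_neuron(x_bits, w_pos_bits, w_neg_bits, threshold):
    """Binary neuron output: fire (1) when the pre-activation reaches the
    threshold, with BN already folded into that threshold."""
    return 1 if signed_and_dot(x_bits, w_pos_bits, w_neg_bits) >= threshold else 0

# Example: x = [1, 0, 1, 1] (x_0 in bit 0), w = [+1, -1, -1, +1]
x = 0b1101       # bits x_3 x_2 x_1 x_0
w_pos = 0b1001   # +1 weights at positions 0 and 3
w_neg = 0b0110   # -1 weights at positions 1 and 2
theta = fold_bn_into_threshold(gamma=1.0, beta=0.5, mu=0.0, sigma=1.0)
print(signed_and_dot(x, w_pos, w_neg))        # 2 - 1 = 1, matches 1*1 + 0*-1 + 1*-1 + 1*1
print(binary_neuron(x, w_pos, w_neg, theta))  # 1 >= -0.5 -> fires, outputs 1
```

In hardware, the two population counts map to adder trees fed by AND gates rather than multipliers, which is where the savings in logic, memory, and power reported in the abstract come from.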
