Abstract

Spiking neural networks (SNNs) have emerged as a promising alternative to artificial neural networks (ANNs) owing to their energy efficiency on resource-constrained devices via event-driven sparsity. However, state-of-the-art SNNs use long time steps and full-precision weights to implement complex image-classification applications, which leads to significant latency and computational cost in hardware implementations. To this end, this paper combines a surrogate-gradient-based SNN model with a threshold-based ternary weight paradigm to exploit the efficiency gains of spike inputs {0, 1} and discrete ternary weights {−1, 0, 1}. In this manner, the internal state update of the SNN can be accelerated by a binary-ternary dot product that replaces multiply-and-accumulate operations. Moreover, binary-ternary dot products can be designed as gated AND networks (GAND-Nets): only event-driven non-zero activations enable the control gate to trigger the AND logic operations, which leads toward energy-efficient edge intelligence. As a proof of concept, we evaluate the proposed GAND-Nets on the CIFAR-10, CIFAR-100, and NMNIST datasets with stochastic rate encoding, achieving 87.42%, 63.42%, and 98.43% accuracy, respectively, with fewer time steps, while providing 1-bit-width binary-ternary dot-product acceleration.
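The core idea of the binary-ternary dot product can be illustrated with a minimal sketch. This is a hypothetical illustration, not the authors' implementation: since spikes are restricted to {0, 1} and weights to {−1, 0, 1}, each multiply-accumulate collapses into a spike-gated add or subtract, mirroring the gated AND behavior described above.

```python
def binary_ternary_dot(spikes, weights):
    """Dot product of a binary spike vector {0, 1} with a ternary
    weight vector {-1, 0, 1} using only gated add/subtract operations.
    """
    acc = 0
    for s, w in zip(spikes, weights):
        if s:          # event-driven gate: a zero spike skips all work
            acc += w   # w in {-1, 0, +1}: pure add/subtract, no multiply
    return acc

# Example with illustrative vectors:
# spikes [1, 0, 1, 1], weights [1, -1, 0, -1] -> 1 + 0 + (-1) = 0
print(binary_ternary_dot([1, 0, 1, 1], [1, -1, 0, -1]))
```

Because the accumulator only ever changes where a spike fires, the same structure maps naturally onto AND-gated hardware logic, where silent inputs consume no switching energy.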
