Abstract

Spiking neural networks (SNNs) have become a popular choice for processing spatiotemporal input data and enabling low‐power, event‐driven spike computation on neuromorphic processors. However, direct SNN training algorithms are poorly compatible with the error back‐propagation process, while indirect conversion algorithms based on artificial neural networks (ANNs) usually lose accuracy due to various approximation errors. Both approaches suffer from lower accuracies than their reference ANNs and require many time steps to achieve stable performance in deep architectures. In this article, a novel conversion framework with negative‐spike dynamics is presented for deep SNNs, which incorporates a quantization constraint and a spike compensation technique into the ANN‐to‐SNN conversion, achieving truly lossless accuracy relative to the ANN counterparts. The converted SNNs retain the full advantages of simple leaky‐integrate‐and‐fire spiking neurons and are well suited for hardware implementation. Experimental results show that the converted spiking LeNet on MNIST/Fashion‐MNIST and VGG‐Net on the CIFAR‐10 dataset yield state‐of‐the‐art classification accuracies with greatly shortened computing time steps and far fewer synaptic operations.
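As a minimal sketch of the leaky‐integrate‐and‐fire dynamics the abstract refers to, the following discrete‐time update is a common formulation used in ANN‐to‐SNN conversion; the leak factor, threshold, and reset‐by‐subtraction scheme here are illustrative assumptions, not the paper's specific method (which additionally uses negative spikes and spike compensation).

```python
import numpy as np

def lif_step(v, x, leak=0.9, v_th=1.0):
    """One discrete time step of a leaky-integrate-and-fire neuron.

    leak and v_th are illustrative values, not taken from the paper.
    """
    v = leak * v + x                    # leaky integration of input current
    spike = (v >= v_th).astype(float)   # fire where membrane crosses threshold
    v = v - spike * v_th                # soft reset: subtract threshold
    return v, spike

# Constant input currents for three hypothetical neurons.
v = np.zeros(3)
spikes = []
for _ in range(5):
    v, s = lif_step(v, np.array([0.4, 0.1, 0.6]))
    spikes.append(s)

# In rate-based conversion, firing rates over the time window
# approximate the corresponding ANN activations.
rates = np.mean(spikes, axis=0)
```

With these assumed parameters, a neuron driven by a larger constant input fires proportionally more often, which is the intuition behind rate‐based ANN‐to‐SNN conversion; fewer time steps give coarser rate estimates, motivating the shortened‐time‐step results reported above.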
