Abstract

The non-differentiability of spike activity has hindered the development of high-performance spiking neural networks (SNNs). Current learning algorithms mainly train SNNs with surrogate gradients or ANN-to-SNN conversion, yet their performance remains limited. Probability-based SNNs instead use a probabilistic mechanism to smooth spike activity, offering a promising route for training SNNs. This work optimizes that probabilistic mechanism and proposes the probabilistic firing mechanism (PFM) for spiking neurons. PFM makes spike activity differentiable and can be adapted to a variety of spiking neuron models. In addition, to counteract the negative influence of probabilistic uncertainty, the attention discrimination mechanism (ADM) is proposed, which enables neurons to respond efficiently by adaptively distinguishing the salient elements of the input current. By fusing PFM, ADM, and the Leaky Integrate-and-Fire (LIF) neuron, we construct the Probabilistic Attention Leaky Integrate-and-Fire (PALIF) neuron and the Probabilistic Attention Spiking Neural Network (PASNN). Ablation studies confirm the effectiveness of PFM and ADM and indicate that PASNN is well suited to low-latency scenarios. Experiments on both static image and neuromorphic datasets, including CIFAR10, CIFAR100, N-MNIST, and CIFAR10-DVS, demonstrate that PASNN achieves competitive accuracy and inference speed.
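The abstract does not give the exact form of PFM, so as a rough illustration of the general idea of probabilistic firing, the sketch below implements a single LIF time step whose spike is drawn from a Bernoulli distribution with a sigmoid firing probability, using a straight-through estimator to keep the step differentiable. All names and parameters here (`probabilistic_lif_step`, `tau`, `v_th`, `temp`) are hypothetical illustrations, not taken from the paper.

```python
import torch

def probabilistic_lif_step(x, v, tau=2.0, v_th=1.0, temp=4.0):
    """One LIF time step with probabilistic (Bernoulli) firing.

    Hypothetical sketch: the paper's actual PFM formulation may differ.
    """
    # Leaky integration of the input current into the membrane potential.
    v = v + (x - v) / tau
    # Firing probability: sigmoid of the distance to threshold
    # (a smooth, differentiable surrogate for the hard spike decision).
    p = torch.sigmoid(temp * (v - v_th))
    # Sample a binary spike from a Bernoulli distribution.
    spike = torch.bernoulli(p)
    # Straight-through estimator: the forward pass uses the sampled
    # spike, while gradients in the backward pass flow through p.
    spike = p + (spike - p).detach()
    # Hard reset of the membrane potential wherever a spike occurred.
    v = v * (1.0 - spike.detach())
    return spike, v

# Usage: run a few time steps over a batch of input currents.
v = torch.zeros(4, 8)                    # membrane potential state
for t in range(10):
    x = torch.rand(4, 8)                 # input current at step t
    spike, v = probabilistic_lif_step(x, v)
```

The sigmoid temperature `temp` controls how sharply the firing probability transitions around the threshold; one benefit of sampling from a probability rather than thresholding deterministically is that the same quantity serves as both the stochastic spike generator and the gradient path.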
