Abstract

Spiking neural networks (SNNs) transmit information through discrete spikes, which makes them well suited to processing spatio-temporal information. Owing to the nondifferentiable nature of spikes, however, designing SNNs that deliver good performance remains difficult. SNNs trained with backpropagation using approximate gradients have recently exhibited impressive performance, yet on complex tasks they still fall significantly short of deep neural networks. Inspired by autapses in the brain, which connect a spiking neuron to itself through a self-feedback connection, we apply adaptive time-delayed self-feedback to the membrane potential to regulate the precision of the spikes. We also balance the excitatory and inhibitory mechanisms of neurons to dynamically control the output of spiking neurons. Combining these two mechanisms, we propose a deep SNN with adaptive self-feedback and balanced excitatory and inhibitory neurons (BackEISNN). Experiments on several standard datasets show that the two modules not only accelerate the convergence of the network but also improve its accuracy. Our model achieved state-of-the-art performance on the MNIST, Fashion-MNIST, and N-MNIST datasets. BackEISNN also achieved remarkably good performance on the CIFAR10 dataset with a relatively lightweight structure that is competitive with state-of-the-art SNNs.
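The abstract does not give the model equations, but the autapse-style mechanism it describes can be illustrated with a minimal sketch: a leaky integrate-and-fire neuron whose membrane potential receives a time-delayed copy of its own past spikes. The parameter names (tau, v_th, fb_delay, fb_gain), the negative sign of the feedback, and the hard reset are illustrative assumptions, not the paper's exact formulation.

    import numpy as np

    def lif_with_self_feedback(inputs, tau=2.0, v_th=1.0, fb_delay=2, fb_gain=0.3):
        """Single LIF neuron with time-delayed self-feedback on its membrane
        potential (an autapse-like loop).

        inputs:   1-D array of input currents, one entry per time step.
        tau:      membrane time constant (leak factor is exp(-1/tau)).
        v_th:     firing threshold.
        fb_delay: delay of the self-feedback, in time steps.
        fb_gain:  strength of the (assumed inhibitory) self-feedback.
        """
        decay = np.exp(-1.0 / tau)
        v = 0.0
        spikes = np.zeros(len(inputs))
        for t, x in enumerate(inputs):
            # delayed self-feedback: the neuron's own spike from fb_delay steps ago
            fb = spikes[t - fb_delay] if t >= fb_delay else 0.0
            # leaky integration of the input plus the negative delayed feedback
            v = decay * v + x - fb_gain * fb
            if v >= v_th:
                spikes[t] = 1.0
                v = 0.0  # hard reset after a spike
        return spikes

    # Example: a constant input current of 0.6 over 20 time steps
    print(lif_with_self_feedback(np.full(20, 0.6)))

In this sketch the delayed feedback slightly suppresses the membrane potential after each spike, which is one simple way such a loop could regulate spike timing; the paper's adaptive version and its excitation-inhibition balance are described only qualitatively in the abstract.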
