Abstract

Intrinsic plasticity (IP) is an unsupervised, self-adaptive, local learning rule first observed in biological neurons, and it has been shown to maximize the entropy of neuronal information transmission. In this paper, we propose a soft-reset leaky integrate-and-fire (LIF) model, a spiking neuron model based on the widely used LIF neuron, together with a new IP learning rule that drives the neuronal membrane potential toward an exponential distribution. Previous studies have generally taken quantities such as a spiking neuron's expected firing rate as the target variable when maximizing the output spike distribution. In contrast, the proposed soft-reset model avoids the problem that the conventional LIF membrane potential is not fully differentiable, so the proposed IP rule can directly regulate the membrane potential, treated as an auxiliary "output signal," toward the desired distribution and thereby maximize its information entropy. We experimentally evaluated the proposed IP rule on pattern-recognition tasks using spiking feed-forward and spiking convolutional neural network models. The experimental results verify that the proposed IP rule effectively improves the computational performance of spiking neural networks in terms of classification accuracy, spiking inference speed, and noise robustness.
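The key mechanical difference the abstract relies on can be illustrated with a minimal sketch. In a hard-reset LIF neuron, the membrane potential is clamped to the resting value after a spike, which introduces a discontinuity; a soft reset instead subtracts the firing threshold, preserving the residual potential. The parameter names and values below are illustrative assumptions, not the paper's actual equations:

```python
import numpy as np

def soft_reset_lif(inputs, tau=0.9, v_th=1.0):
    """Simulate one soft-reset LIF neuron over a sequence of inputs.

    tau  : leak (decay) factor of the membrane potential (assumed value)
    v_th : firing threshold (assumed value)

    A hard-reset LIF would set v = 0 after a spike; the soft reset
    subtracts v_th instead, so the above-threshold residual survives
    and the potential trace remains a smoother function of the input.
    """
    v = 0.0
    potentials, spikes = [], []
    for x in inputs:
        v = tau * v + x                 # leaky integration
        s = 1.0 if v >= v_th else 0.0   # threshold crossing -> spike
        v = v - s * v_th                # soft reset: subtract threshold
        potentials.append(v)
        spikes.append(s)
    return np.array(potentials), np.array(spikes)

# A constant drive of 0.6 makes the neuron spike on the second step;
# the post-spike potential keeps the 0.14 residual instead of resetting to 0.
vs, spikes = soft_reset_lif([0.6, 0.6, 0.6])
```

Because the post-spike potential retains information about how far the neuron exceeded threshold, the membrane potential itself can serve as the continuous "output signal" that an IP rule shapes toward an exponential distribution.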
