Abstract

Error functions in supervised learning algorithms for spiking neural networks (SNNs) are normally based on the distance between output spikes and target spikes. Due to the discontinuous nature of the internal state of a spiking neuron, it is challenging to ensure that the number of output spikes and the number of target spikes remain identical in multispike learning. This problem is conventionally handled by using the smaller of the number of desired spikes and the number of actual output spikes during learning. However, this approach loses information because some spikes are neglected. In this paper, a probability-modulated timing mechanism is built on stochastic neurons, in which discontinuous spike patterns are converted to the likelihood of generating the desired output spike trains. By applying this mechanism to a probability-modulated spiking classifier, a probability-modulated SNN (PMSNN) is constructed. In its multilayer, multispike learning structure, more inputs are incorporated and mapped to the target spike trains. A clustering-rule connection mechanism is also applied to a reservoir to improve the efficiency of information transmission among synapses, mapping highly correlated inputs to adjacent neurons. Comparisons between the proposed method and popular SNN algorithms show that the PMSNN yields higher efficiency and requires fewer parameters.
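The core idea of replacing a spike-distance error with a likelihood objective can be sketched as follows. This is a generic illustration using a common escape-noise stochastic neuron model, not the paper's exact formulation: the sigmoid firing probability, the threshold `v_th`, and the steepness `beta` are all assumptions for the sketch.

```python
import numpy as np

def spike_probability(v, v_th=1.0, beta=5.0):
    # Escape-noise assumption (illustrative): the probability of firing
    # in a time bin is a sigmoid of the membrane potential's distance
    # from an assumed threshold v_th, with steepness beta.
    return 1.0 / (1.0 + np.exp(-beta * (np.asarray(v) - v_th)))

def log_likelihood(v_trace, target_spikes, v_th=1.0, beta=5.0):
    # Bernoulli log-likelihood of a binary target spike train under the
    # stochastic neuron. Unlike a spike-distance error, this objective is
    # defined even when the actual and desired spike counts differ, so no
    # spikes need to be discarded.
    p = spike_probability(v_trace, v_th, beta)
    t = np.asarray(target_spikes)
    eps = 1e-12  # guard against log(0)
    return float(np.sum(t * np.log(p + eps) + (1 - t) * np.log(1 - p + eps)))
```

A membrane trace that is high exactly where the target train has spikes receives a higher likelihood than one that fires in the wrong bins, giving a smooth quantity to maximize in place of a discontinuous spike-count match.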
