Abstract

Spike-timing-dependent plasticity (STDP) is one of the most popular and most biologically grounded forms of unsupervised Hebbian-type learning. In this article, we propose a variant of STDP extended by an additional activation-dependent scale factor. The resulting learning rule is efficient, simple to implement, and applicable to spiking neural networks (SNNs). We demonstrate that the proposed plasticity mechanism, combined with competitive learning, can serve as an effective mechanism for the unsupervised development of receptive fields (RFs). Furthermore, the relationship between synaptic scaling and lateral inhibition is explored in the context of successful RF development. Specifically, we demonstrate that maintaining a high level of synaptic scaling followed by its rapid increase is crucial for the development of neuronal selectivity mechanisms. The strength of the proposed solution is assessed in classification tasks on the Modified National Institute of Standards and Technology (MNIST) data set, reaching an accuracy of 94.65% for a single network and 95.17% for a network committee, comparable to state-of-the-art results for single-layer SNN architectures trained in an unsupervised manner. Furthermore, the training process leads to sparse data representations, and the developed RFs have the potential to serve as local feature detectors in multilayered spiking networks. We also prove theoretically that, when applied to linear Poisson neurons, our rule conserves total synaptic strength, guaranteeing the convergence of the learning process.
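As a rough illustration of the mechanism described above, the sketch below implements trace-based pair STDP whose weight updates are modulated by an activation-dependent scale factor. It is only a hedged approximation: the form of scale_factor, the constants, and all names are assumptions made for this example, not the article's actual formulation (which is given in the "Mathematical Model of Plasticity" section).

    import numpy as np

    # Minimal sketch of trace-based pair STDP with a hypothetical
    # activation-dependent scale factor. All constants and the exact
    # form of the factor are illustrative assumptions.

    A_PLUS, A_MINUS = 0.01, 0.012     # potentiation / depression amplitudes
    TAU_PLUS, TAU_MINUS = 20.0, 20.0  # trace time constants (ms)

    def scale_factor(post_activity, target=10.0):
        """Hypothetical activation-dependent factor: shrinks updates
        as postsynaptic activity approaches a target level."""
        return np.exp(-post_activity / target)

    def stdp_update(w, pre_trace, post_trace, pre_spike, post_spike, post_activity):
        """One simulation step of trace-based STDP.
        Traces decay exponentially and jump by 1 on spikes (0/1 flags)."""
        pre_trace = pre_trace * np.exp(-1.0 / TAU_PLUS) + pre_spike
        post_trace = post_trace * np.exp(-1.0 / TAU_MINUS) + post_spike
        s = scale_factor(post_activity)
        # LTP: presynaptic trace is read out at postsynaptic spike times.
        w += s * A_PLUS * pre_trace * post_spike
        # LTD: postsynaptic trace is read out at presynaptic spike times.
        w -= s * A_MINUS * post_trace * pre_spike
        return np.clip(w, 0.0, 1.0), pre_trace, post_trace

    # Example: a single synapse over three steps.
    w, pre_tr, post_tr = 0.5, 0.0, 0.0
    for pre, post in [(1, 0), (0, 1), (1, 1)]:
        w, pre_tr, post_tr = stdp_update(w, pre_tr, post_tr, pre, post, post_activity=5.0)

In this toy form, the factor damps updates as postsynaptic activity rises, which is one simple way to counteract the Hebbian positive feedback discussed in the Introduction.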

Highlights

  • Powerful computation with low power consumption is one of the key distinctive features of mammalian brains when compared with contemporary computing devices

  • The development of receptive fields (RFs) and overall brain plasticity can be explained to some extent by the learning rule proposed by Hebb [3], who emphasized the importance of the correlation between presynaptic and postsynaptic activity in the process of strengthening the synaptic efficacy between two neurons

  • A Hebbian learning (HL) rule in its classical formulation [3] postulates the occurrence of long-term potentiation (LTP) when a presynaptic neuron repeatedly or persistently takes part in firing a postsynaptic neuron


Summary

INTRODUCTION

Powerful computation with low power consumption is one of the key distinctive features of mammalian brains when compared with contemporary computing devices. The development of RFs and overall brain plasticity can be explained to some extent by the learning rule proposed by Hebb [3], who emphasized the importance of the correlation between presynaptic and postsynaptic activity in the process of strengthening the synaptic efficacy between two neurons. A Hebbian learning (HL) rule in its classical formulation [3] postulates the occurrence of long-term potentiation (LTP) when a presynaptic neuron repeatedly or persistently takes part in firing a postsynaptic neuron. Such a learning rule creates a positive feedback mechanism, which increases the efficacy of synaptic weights and enables the growth of neuronal activity. In the sole presence of this weight-increasing mechanism, a neuron would at some point become responsive to all possible input patterns and would suffer from explosive growth of its firing rate. The activation-dependent scale factor proposed here counteracts this positive feedback by conserving total synaptic strength; a proof of convergence of the proposed learning rule is presented in the Appendix.
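To make the positive-feedback problem concrete, the toy rate-based simulation below contrasts plain Hebbian updates, whose output grows explosively, with the same updates followed by renormalization of the weights to a fixed total, in the spirit of the conservation property proved in the Appendix. All parameters are illustrative assumptions, not values from the article.

    import numpy as np

    # Toy rate-based illustration of Hebbian positive feedback
    # (all parameters are illustrative, not taken from the article).
    rng = np.random.default_rng(0)
    x = rng.random((1000, 20))        # presynaptic rate patterns
    w_plain = np.full(20, 0.05)       # plain Hebbian weights
    w_norm = w_plain.copy()           # weights with conserved total strength
    total = w_norm.sum()
    eta = 0.01                        # learning rate

    for pattern in x:
        # Linear neuron: postsynaptic rate = weighted sum of inputs.
        y_plain = w_plain @ pattern
        y_norm = w_norm @ pattern
        # Classical Hebbian update: dw ∝ pre * post (positive feedback).
        w_plain += eta * y_plain * pattern
        w_norm += eta * y_norm * pattern
        # Conserving total synaptic strength removes the runaway growth.
        w_norm *= total / w_norm.sum()

    print(f"plain Hebbian output:      {w_plain @ x[-1]:.2e}")  # explodes
    print(f"normalized Hebbian output: {w_norm @ x[-1]:.2e}")   # stays bounded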

MOTIVATION AND CONTRIBUTION
RELATED WORK
Rate-Based Models
Spike-Based Models
Network Architecture and Neuron Model
Mathematical Model of Plasticity
Plasticity Implementation Details
EXPERIMENTAL EVALUATION
Classification With RF Networks
Influence of Competitiveness on RFs Development
Method Evaluation in Larger Networks
DISCUSSION
Time Efficiency
Ablation Study
Possible Research Prospects
CONCLUSION
Proof of Learning Stability