Abstract
The ability of neural networks to perform pattern recognition, classification, and associative memory is essential to applications such as image and speech recognition, natural language understanding, and decision making. In spiking neural networks (SNNs), information is encoded as sparsely distributed trains of spikes, which enables learning through the spike-timing-dependent plasticity (STDP) mechanism. SNNs can potentially achieve very large-scale implementation and distributed learning due to their inherently asynchronous and sparse inter-neuron communication. In this work, we develop an efficient, scalable, and flexible SNN simulator that supports STDP learning. The simulator targets biologically inspired neuron models used for computation rather than biologically realistic models. A Bayesian neuron model for SNNs that is capable of online, fully distributed STDP learning is introduced. The functionality of the simulator is validated using two networks representing two different applications, ranging from unsupervised feature extraction to inference-based sentence construction.
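To make the STDP mechanism mentioned above concrete, the following is a minimal illustrative sketch of a pair-based STDP weight update. The exponential form and all parameter values (`a_plus`, `a_minus`, `tau_plus`, `tau_minus`) are common textbook assumptions, not the specific rule used in this work's Bayesian neuron model.

```python
import math

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Illustrative pair-based STDP weight change (assumed parameters).

    dt = t_post - t_pre in milliseconds. A positive dt (presynaptic spike
    precedes the postsynaptic spike) potentiates the synapse; a negative or
    zero dt depresses it. Magnitudes decay exponentially with |dt|.
    """
    if dt > 0:
        return a_plus * math.exp(-dt / tau_plus)   # long-term potentiation
    return -a_minus * math.exp(dt / tau_minus)     # long-term depression

# Example: causal pairing strengthens, anti-causal pairing weakens.
print(stdp_dw(10.0))   # positive change (potentiation)
print(stdp_dw(-10.0))  # negative change (depression)
```

In an event-driven simulator, a rule of this shape is typically applied only when spikes occur, which preserves the sparse, asynchronous communication pattern the abstract highlights.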