Abstract
This paper describes a digital silicon neuronal network, trained with Hebbian learning rules, that performs auto-associative memory. In our previous work, we implemented a fully connected network of 256 silicon neurons based on the digital spiking silicon neuron (DSSN) model together with kinetic-model-based silicon synapses. In this work, we added circuit modules that provide the Hebbian learning function and fitted the design into a Xilinx Virtex-6 XC6VSX315T FPGA device. We compare the auto-associative memory performance of several spike-timing-dependent Hebbian learning rules and of the correlation rule. The results show that Hebbian learning rules that model both synaptic potentiation and depression improve the retrieval probability in our silicon neuronal network.
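The abstract does not reproduce the learning-rule equations themselves, so as an illustration only, the sketch below shows one common formulation of the correlation rule for auto-associative memory: a Hopfield-style outer-product weight matrix over binary patterns with sign-based retrieval. The network size, pattern count, and noise level here are placeholder assumptions, not values taken from the paper.

```python
import numpy as np

def correlation_rule_weights(patterns):
    """Correlation (Hopfield-style) rule: W = (1/N) * sum_p xi_p xi_p^T, zero diagonal.

    patterns: array of shape (P, N) with entries in {-1, +1}.
    """
    P, N = patterns.shape
    W = patterns.T @ patterns / N      # sum of outer products (correlations)
    np.fill_diagonal(W, 0.0)           # no self-connections
    return W

def recall(W, probe, steps=10):
    """Synchronous sign-update retrieval starting from a noisy probe pattern."""
    state = probe.copy().astype(float)
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1.0        # break ties toward +1
    return state

# Example (illustrative sizes): store two random patterns, retrieve from a corrupted cue.
rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(2, 256))
W = correlation_rule_weights(patterns)
cue = patterns[0].copy()
flip = rng.choice(256, size=30, replace=False)
cue[flip] *= -1                        # corrupt ~12% of the bits
retrieved = recall(W, cue)
print("overlap with stored pattern:", int(retrieved @ patterns[0]))
```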
Highlights
Sensory experiences are thought to configure the nervous system
Long-lasting synaptic modification comes in two types, long-term potentiation (LTP) and long-term depression (LTD), which strengthen and weaken synapses, respectively (a sketch of a rule with both branches follows this list)
The model of our silicon neuronal network is composed of the Digital Spiking Silicon Neuron (DSSN) model(12) and a silicon synapse model proposed in our previous work(13)
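As a rough illustration of a spike-timing-dependent rule that models both potentiation (LTP) and depression (LTD), below is a minimal pair-based STDP kernel. The amplitudes and time constants are placeholder assumptions and do not represent the specific learning rules evaluated in the paper.

```python
import numpy as np

def stdp_delta_w(delta_t, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP kernel (illustrative parameters, not the paper's rules).

    delta_t = t_post - t_pre in ms. Positive delta_t (pre fires before post)
    yields potentiation (LTP); negative delta_t yields depression (LTD).
    """
    if delta_t > 0:
        return a_plus * np.exp(-delta_t / tau_plus)     # LTP branch
    elif delta_t < 0:
        return -a_minus * np.exp(delta_t / tau_minus)   # LTD branch
    return 0.0

# Example: weight change for a few pre/post spike-time differences (ms).
for dt in (-40.0, -5.0, 5.0, 40.0):
    print(f"delta_t = {dt:+6.1f} ms  ->  delta_w = {stdp_delta_w(dt):+.5f}")
```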
Summary
Sensory experiences are thought to configure the nervous system. Synaptic plasticity, the change in connection strength between two neurons, underlies this phenomenon. We proposed a silicon neuronal network implemented in an FPGA device. It is composed of silicon neurons and synapses that are optimized for implementation with digital arithmetic circuits and are capable of real-time operation in entry-level FPGA devices. The model of these silicon neurons was designed from the viewpoint of nonlinear dynamics and can reproduce the graded spike response of Class II neurons in Hodgkin's classification(11).