Abstract

Recent neurophysiological results indicate that changes in synaptic efficacy depend on the co-occurrence of a presynaptic and a postsynaptic spike at the synapse [5,8]. Only a few models of parts of the nervous system use the temporal correlation of single spikes for learning [1]. In most artificial neural network models, neurons communicate via analog signals representing firing frequencies, and the learning rules are defined on these continuous signals. The timing of single spikes is neither represented nor used. This simplification has proven useful in many applications and makes software simulations simpler and faster. Spiking systems have been avoided because they are computationally more demanding. However, by implementing spiking, learning artificial neurons in analog VLSI, it is possible to examine the behavior of these more detailed models in real time. This is why we and others [4] have started to use silicon models of spiking, learning neurons. We have formulated one possible mechanism of weight normalization: a Hebbian learning rule that makes use of temporal correlations between single spikes. We have implemented it on a CMOS chip and demonstrate its normalizing behavior.
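To make the idea concrete, the following is a minimal software sketch of a spike-timing-based Hebbian update combined with weight normalization. The abstract does not specify the exact form of the rule realized on the chip, so the exponential timing window, the constants `a_plus`, `a_minus`, `tau`, `target_sum`, and the multiplicative rescaling step are all illustrative assumptions, not the authors' circuit-level mechanism.

```python
import numpy as np

def stdp_update(weights, pre_spike_times, post_spike_time,
                a_plus=0.01, a_minus=0.012, tau=20.0, target_sum=1.0):
    """Pairwise spike-timing-dependent Hebbian update followed by normalization.

    weights         : array of synaptic efficacies, one per presynaptic input
    pre_spike_times : most recent spike time of each presynaptic neuron (ms)
    post_spike_time : time of the current postsynaptic spike (ms)
    """
    dt = post_spike_time - np.asarray(pre_spike_times)   # dt > 0: pre before post
    dw = np.where(dt >= 0,
                  a_plus * np.exp(-dt / tau),             # potentiation
                  -a_minus * np.exp(dt / tau))            # depression
    w = np.clip(weights + dw, 0.0, None)

    # Multiplicative normalization: rescale so the summed efficacy stays at
    # target_sum. This is one possible way to obtain normalizing behavior;
    # it is an assumption here, not the mechanism described in the paper.
    total = w.sum()
    return w * (target_sum / total) if total > 0 else w

# Example: four synapses, a postsynaptic spike at t = 100 ms.
w = np.full(4, 0.25)
w = stdp_update(w, pre_spike_times=[95.0, 98.0, 103.0, 110.0],
                post_spike_time=100.0)
```

Inputs that fired shortly before the postsynaptic spike are strengthened, later ones are weakened, and the normalization step keeps the total synaptic weight bounded, which is the qualitative behavior the abstract attributes to the chip.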
