Abstract
This paper explores a variant of Hebbian learning in which binary synapses are updated stochastically rather than deterministically. In this variant, a single potentiation or depression event sets the synaptic weight to one or zero, respectively, with some finite probability, provided the synapse does not already hold that value. This learning rule is compared with the conventional Hebbian rule, in which a continuously valued synapse moves a fraction of the way towards 1.0 or 0.0. It is shown that, given a set of input-output pattern pairs, the expected value of a particular synapse is the same under both learning rules. Moreover, as the network size and the input activity levels increase, the signal-to-noise ratio of the dendritic sums approaches infinity. These stochastic binary synapses are presented as a viable mechanism for the VLSI implementation of Hebbian-based neural networks.
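To make the comparison between the two update rules concrete, the following is a minimal sketch (not the authors' implementation) of a single synaptic update under each rule. The learning-rate parameter `alpha`, the update probability `p`, and the function names are illustrative assumptions; the sketch only assumes what the abstract states: the conventional rule moves a continuous weight a fraction of the way towards 1.0 or 0.0, while the stochastic rule sets a binary weight to 1 or 0 with finite probability if it is not already that value.

```python
import numpy as np

def conventional_update(w, potentiate, alpha=0.1):
    """Conventional Hebbian rule: a continuous-valued synapse moves a
    fraction `alpha` of the remaining distance towards 1.0 (potentiation)
    or 0.0 (depression)."""
    target = 1.0 if potentiate else 0.0
    return w + alpha * (target - w)

def stochastic_binary_update(w, potentiate, p=0.1, rng=None):
    """Stochastic binary rule: with probability `p`, set the binary synapse
    to 1 (potentiation) or 0 (depression), but only if it does not already
    hold that value."""
    rng = rng or np.random.default_rng()
    target = 1 if potentiate else 0
    if w != target and rng.random() < p:
        return target
    return w

# Illustration of the expected-value claim: with p equal to alpha, the
# stochastic rule's average outcome matches the conventional rule's update.
rng = np.random.default_rng(0)
w0, alpha = 0, 0.1
samples = [stochastic_binary_update(w0, potentiate=True, p=alpha, rng=rng)
           for _ in range(100_000)]
print("conventional:", conventional_update(w0, potentiate=True, alpha=alpha))
print("stochastic mean:", np.mean(samples))  # approximately the same value
```

The small Monte Carlo check at the end illustrates why the expected weight is identical under both rules: the stochastic update changes the weight by (target - w) with probability p, so its expectation is w + p(target - w), which is exactly the conventional fractional move when p = alpha.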