Abstract
Neural associative networks are a promising computational paradigm, both for modeling neural circuits of the brain and for implementing Hebbian cell assemblies in parallel VLSI or nanoscale hardware. Previous work has extensively investigated synaptic learning in linear models of the Hopfield type and simple non-linear models of the Steinbuch/Willshaw type. For example, optimized Hopfield networks of n neurons can memorize about n²/k cell assemblies of size k (or associations between them), corresponding to a synaptic capacity of 0.72 bits per real-valued synapse. Although they employ much simpler synapses that are far better suited to efficient hardware implementations, Willshaw networks can still store up to 0.69 bits per binary synapse. However, the number of cell assemblies is limited to about n²/k², which becomes comparable to Hopfield networks only for extremely small k. Here I present zip nets, an improved non-linear learning method for binary synapses that combines the advantages of the previous models. Zip nets have, up to a factor 2/π ≈ 0.64, the same high storage capacity as Hopfield networks. Moreover, for low-entropy synapses (e.g., if most synapses are silent), zip nets can be compressed, storing up to 1 bit per computer bit or, with synaptic pruning, up to log n bits per synapse. The same holds for a generalized zip net model employing discrete synapses with an arbitrary number of states.
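For orientation, the sketch below illustrates the classical Steinbuch/Willshaw-type binary associative memory that the abstract compares against: associations between sparse binary patterns are stored by clipped Hebbian learning (a synapse is switched on once pre- and postsynaptic units are coactive), and retrieval thresholds the dendritic potentials at the query activity. This is not the zip net learning rule itself, whose details the abstract does not specify; the pattern size k, network size n, number of stored pairs M, and the helper names random_pattern and retrieve are illustrative assumptions.

```python
# Minimal sketch of a Willshaw/Steinbuch-type binary associative memory.
# Assumed parameters (not taken from the abstract): n, k, M below.
import numpy as np

rng = np.random.default_rng(0)

n, k, M = 1000, 10, 500  # neurons, cell-assembly size, stored associations

def random_pattern():
    """Sparse binary pattern with exactly k active units out of n."""
    p = np.zeros(n, dtype=np.uint8)
    p[rng.choice(n, size=k, replace=False)] = 1
    return p

X = np.array([random_pattern() for _ in range(M)])  # address patterns
Y = np.array([random_pattern() for _ in range(M)])  # content patterns

# Clipped Hebbian learning: binary synapse w_ij = 1 if input unit j and
# output unit i were coactive in at least one stored pattern pair.
W = (Y.T @ X > 0).astype(np.uint8)  # n x n binary weight matrix

def retrieve(x_query):
    """One-step retrieval: threshold dendritic potentials at the query activity."""
    potentials = W @ x_query.astype(np.int64)
    return (potentials >= x_query.sum()).astype(np.uint8)

# Recall the first stored content pattern from its noise-free address.
y_hat = retrieve(X[0])
print("correct bits recovered:", int((y_hat == Y[0]).sum()), "of", n)
```

With these illustrative values (M = 500 stored pairs, well below the roughly ln 2 · n²/k² pairs such a network can hold), recall is essentially error-free; pushing M toward that limit is what yields the 0.69 bits per binary synapse cited in the abstract.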