Abstract

The Hopfield model is a pioneering neural network model of associative memory retrieval. The analytical solution of the model in the mean-field limit revealed that memories can be retrieved without any error up to a finite storage capacity of O(N), where N is the system size. Beyond this threshold, they are completely lost. Since the introduction of the Hopfield model, the theory of neural networks has been developed further toward realistic neural networks using analog neurons, spiking neurons, etc. Nevertheless, those advances are based on fully connected networks, which are inconsistent with the recent experimental finding that the number of connections per neuron is heterogeneous, following a heavy-tailed distribution. Motivated by this observation, we consider the Hopfield model on scale-free networks and obtain a pattern of associative memory retrieval different from that on the fully connected network: the storage capacity is tremendously enhanced, but some error appears in the memory retrieval as the heterogeneity of the connections increases. Moreover, the error rates obtained on several real neural networks are indeed similar to those on scale-free model networks.

Highlights

  • Human neuroscience has attracted increasing attention through various studies

  • Neural network models of associative memory have been used to explain how the brain stores and recalls long-term memories. These models incorporate the so-called Hebbian rule for a cell assembly, a group of excitatory neurons mutually coupled by strong synapses [3]: Memory storage occurs when a cell assembly is created by Hebbian synaptic plasticity, and memory retrieval or recall occurs when the neurons in the cell assembly are activated by a stimulus

  • We found that as the network changes from a hub-absent network to a scale-free (SF) network with degree exponent just above two, the storage capacity is tremendously enhanced, but some error occurs


Introduction

Human neuroscience has attracted increasing attention through various studies. Among these, the retrieval, or recall, of associative memory in neural networks is a historically notable issue [1, 2]. Neural network models of associative memory have been used to explain how the brain stores and recalls long-term memories. These models incorporate the so-called Hebbian rule for a cell assembly, a group of excitatory neurons mutually coupled by strong synapses [3]: memory storage occurs when a cell assembly is created by Hebbian synaptic plasticity, and memory retrieval or recall occurs when the neurons in the cell assembly are activated by a stimulus. Neural network models of associative memory assume that information exists alternatively as neural activity or as synaptic connectivity.
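The Hebbian storage and retrieval mechanism described above can be illustrated with a minimal sketch of the classic fully connected Hopfield model (this is an illustrative example, not code from the paper; the parameter choices N = 100 neurons and P = 5 patterns are assumptions, chosen to stay well below the O(N) capacity mentioned in the abstract):

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100  # number of binary (+/-1) neurons (assumed for illustration)
P = 5    # number of stored patterns, well below capacity ~0.138 N
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian rule: each stored pattern strengthens the coupling between
# neurons that are co-active in it -> W_ij = (1/N) * sum_mu xi_i^mu xi_j^mu
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)  # no self-coupling

def recall(state, steps=10):
    """Synchronous retrieval dynamics: s_i <- sign(sum_j W_ij s_j)."""
    s = state.copy()
    for _ in range(steps):
        s_new = np.sign(W @ s)
        s_new[s_new == 0] = 1  # break ties deterministically
        if np.array_equal(s_new, s):
            break  # reached a fixed point (a stored memory, ideally)
        s = s_new
    return s

# Corrupt 10% of a stored pattern (the "stimulus") and try to recall it.
probe = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
probe[flip] *= -1
retrieved = recall(probe)

# Overlap close to 1 indicates (near) error-free retrieval below capacity.
overlap = retrieved @ patterns[0] / N
print(overlap)
```

Below the storage capacity the corrupted probe typically flows back to the stored pattern, which is the "recall" step of the cell-assembly picture; above capacity, such fixed points are destroyed in the fully connected model.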

