Abstract
We consider random synaptic pruning in an initially highly interconnected network. It is proved that, for some range of parameters, a random network can maintain a self-sustained activity level. For such a set of parameters a pruning is constructed so that in the resulting network each neuron (node) has almost equal numbers of in- and out-connections. It is also shown that the set of parameters which admits a self-sustained activity level is rather small within the whole space of possible parameters. It is pointed out that the threshold of connectivity for auto-associative memory in a Hopfield model on a random graph coincides with the threshold for bootstrap percolation on the same random graph. It is argued that this coincidence reflects the relation between the auto-associative memory mechanism and the properties of the underlying random network structure.

This article is part of a Special Issue entitled "Neural Coding".
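To make the bootstrap-percolation mechanism mentioned above concrete, the following is a minimal simulation sketch, not taken from the paper itself: bootstrap percolation on an Erdős–Rényi random graph G(n, p), where a node becomes active once at least `theta` of its neighbours are active. All parameter names (`n`, `p`, `theta`, `f0`) and the function itself are illustrative assumptions, not notation from the article.

```python
import random

def bootstrap_percolation(n, p, theta, f0, seed=0):
    """Bootstrap percolation on an Erdos-Renyi graph G(n, p).

    A node becomes (and stays) active once at least `theta` of its
    neighbours are active; a fraction `f0` of nodes is active initially.
    Returns the final fraction of active nodes.
    """
    rng = random.Random(seed)
    # Sample adjacency lists of G(n, p): each edge present independently.
    adj = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].append(j)
                adj[j].append(i)
    # Seed the initial active set at random.
    active = [rng.random() < f0 for _ in range(n)]
    # Iterate the activation rule until no node changes state.
    changed = True
    while changed:
        changed = False
        for v in range(n):
            if not active[v] and sum(active[u] for u in adj[v]) >= theta:
                active[v] = True
                changed = True
    return sum(active) / n
```

Sweeping `f0` or `p` in such a simulation is one way to observe the sharp activation threshold that the article relates to the connectivity threshold for auto-associative memory.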