Abstract

We introduce and analyse a minimal network model of semantic memory in the human brain. The model is a global associative memory structured as a collection of N local modules, each coding a feature which can take S possible values, with a global sparseness a (the average fraction of features describing a concept). We show that, under optimal conditions, the number c_M of modules connected on average to a module can range widely between very sparse connectivity (high dilution, c_M ≪ N) and full connectivity (c_M = N), while maintaining a global network storage capacity (the maximum number p_c of stored and retrievable concepts) that scales like p_c ∼ c_M S²/a, with logarithmic corrections consistent with the constraint that each synapse may store up to a fraction of a bit.
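As an illustration of how the stated scaling law p_c ∼ c_M S²/a behaves, the sketch below evaluates it for a few connectivity levels. The proportionality constant k and the chosen parameter values are placeholders for illustration, not quantities derived from the model.

```python
def capacity_estimate(c_M, S, a, k=1.0):
    """Estimated storage capacity p_c ~ k * c_M * S**2 / a.

    c_M: average number of modules connected to a module
    S:   number of values each feature (local module) can take
    a:   global sparseness (average fraction of features per concept)
    k:   placeholder proportionality constant (assumption, not from the model)
    """
    return k * c_M * S ** 2 / a

# Illustrative parameters: S = 5 feature values, sparseness a = 0.1
for c_M in (10, 100, 1000):
    print(c_M, capacity_estimate(c_M, S=5, a=0.1))
```

Note that the estimate grows linearly in c_M, so sparse and fully connected networks differ in capacity only through the number of connections per module, consistent with the abstract's claim.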
