Abstract

Much evidence indicates that the perirhinal cortex is involved in the familiarity discrimination aspect of recognition memory. It has previously been shown that, under certain conditions, neural networks performing familiarity discrimination can achieve very high storage capacity, being able to deal with many times more stimuli than associative memory networks can in associative recall. The capacity of associative memories for recall has been shown to be highly dependent on the sparseness of coding. However, previous work on the networks of Bogacz et al., Norman and O'Reilly, and Sohal and Hasselmo that model familiarity discrimination in the perirhinal cortex has not investigated the effects of the sparseness of coding on capacity. This paper explores how the sparseness of coding influences the capacity of each of these published models and establishes that it does so in different ways for the different models. The capacity of the Bogacz et al. model can be made independent of the sparseness of coding. Capacity increases as coding becomes sparser for a simplified version of the neocortical part of the Norman and O'Reilly model, whereas capacity decreases as coding becomes sparser for a simplified version of the Sohal and Hasselmo model. Thus, in general and in contrast to associative memory networks, sparse coding confers little or no advantage on the capacity of familiarity discrimination networks. Hence sparse coding may be less important in the perirhinal cortex than it is in the hippocampus. Additionally, it is established that the capacities of the networks depend strongly on the precise form of the learning rules (synaptic plasticity) used in the network. This finding indicates that the precise characteristics of synaptic plasticity in the real brain are likely to have a major influence on storage capacity.
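To make the setting concrete, the following is a minimal, hypothetical sketch (in Python/NumPy) of a familiarity discrimination readout of the general kind these models study; it is not a reproduction of any of the published networks. Random binary patterns with activity level a control the sparseness of coding, a covariance-type Hebbian rule stores them, and a quadratic "energy" score compared against a threshold classifies test patterns as familiar or novel. The learning rule, the activity parameter a, and the threshold choice here are illustrative assumptions rather than details taken from the paper.

# Minimal sketch (not any of the published models verbatim): an energy-style
# familiarity readout with sparse binary codes.
# Assumptions: patterns are random binary vectors with activity level a
# (fraction of active units); weights use a covariance (Hebbian) rule; a
# pattern is judged "familiar" if its energy-like score exceeds a threshold.
import numpy as np

rng = np.random.default_rng(0)

def make_patterns(n_patterns, n_units, a):
    """Random binary patterns with sparseness a (fraction of active units)."""
    return (rng.random((n_patterns, n_units)) < a).astype(float)

def train_weights(patterns, a):
    """Covariance Hebbian rule: W = sum_p (x_p - a)(x_p - a)^T, zero diagonal."""
    centred = patterns - a
    w = centred.T @ centred
    np.fill_diagonal(w, 0.0)
    return w

def familiarity_score(w, pattern, a):
    """Quadratic 'energy' readout; on average higher for stored (familiar) patterns."""
    c = pattern - a
    return c @ w @ c

n_units, a, n_stored = 200, 0.1, 50
stored = make_patterns(n_stored, n_units, a)
novel = make_patterns(n_stored, n_units, a)
w = train_weights(stored, a)

familiar_scores = np.array([familiarity_score(w, p, a) for p in stored])
novel_scores = np.array([familiarity_score(w, p, a) for p in novel])

# Simple threshold midway between the two means; count discrimination errors.
theta = 0.5 * (familiar_scores.mean() + novel_scores.mean())
errors = np.sum(familiar_scores < theta) + np.sum(novel_scores >= theta)
print(f"discrimination errors: {errors} / {2 * n_stored}")

Sweeping the activity level a and the number of stored patterns in such a toy setup gives one simple way to see how sparseness interacts with discrimination errors; the paper addresses the corresponding capacity question rigorously for the specific published models and learning rules.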
