Abstract

Connectionist models of memory storage have been studied for many years, and aim to provide insight into potential mechanisms of memory storage by the brain. A problem faced by these systems is that, as the number of items to be stored increases across a finite set of neurons/synapses, the cumulative changes in synaptic weight eventually lead to a sudden and dramatic loss of the stored information (catastrophic interference, CI), as the previous changes in synaptic weight are effectively lost. This effect does not occur in the brain, where information loss is gradual. Various attempts have been made to overcome the effects of CI, but these generally use schemes that impose restrictions on the system or its inputs rather than allowing the system to intrinsically cope with increasing storage demands. We show here that CI arises from interference among stored patterns, which becomes catastrophic once the number of patterns exceeds a critical limit. However, when Gram-Schmidt orthogonalization is combined with the Hebb-Hopfield model, the model attains the ability to eliminate CI. This approach differs from previous orthogonalization schemes used in connectionist networks, which essentially reflect sparse coding of the input. Here CI is avoided in a network of fixed size without setting limits on the rate or number of patterns encoded, and without separating encoding and retrieval, thus offering the advantage of allowing associations between incoming and stored patterns.

PACS Nos.: 87.10.+e, 87.18.Bb, 87.18.Sn, 87.19.La
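As a rough illustration of the idea described above (not a reproduction of the authors' construction), the Python sketch below orthogonalizes incoming ±1 patterns with Gram-Schmidt before storing the resulting directions with a Hebbian outer-product rule; the function names and details are illustrative assumptions.

```python
import numpy as np

def store_orthogonalized(patterns):
    """Gram-Schmidt the +/-1 patterns, then store the orthonormal directions
    with a Hebbian outer-product rule (illustrative sketch, not the paper's code)."""
    ortho = []
    for p in patterns:
        v = p.astype(float)
        for q in ortho:
            v -= (v @ q) * q              # remove component along already-stored directions
        norm = np.linalg.norm(v)
        if norm > 1e-10:                  # skip patterns linearly dependent on earlier ones
            ortho.append(v / norm)
    N = patterns.shape[1]
    W = np.zeros((N, N))
    for q in ortho:
        W += np.outer(q, q)               # Hebbian storage of mutually orthogonal vectors
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, probe, steps=20):
    """Standard synchronous Hopfield retrieval from a +/-1 probe."""
    s = probe.copy()
    for _ in range(steps):
        s = np.where(W @ s >= 0, 1, -1)
    return s
```

Because the stored directions have zero mutual overlap, the crosstalk between memories that normally drives CI is removed at encoding time, at the cost of the extra Gram-Schmidt pass.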

Highlights

  • Nervous systems have two basic requirements: they must be stable and able to generate reliable specific outputs, while at the same time they must be flexible to allow the output to change during development or as a result of experience

  • This is the "stability-plasticity dilemma" [1], and it is a concern both to neurobiologists, who want to understand how nervous systems cope with constantly changing internal and external conditions, and to those working on artificial neural networks

  • One of the first considerations of this problem was by Bienenstock, Cooper and Munro [2], who suggested that long-term potentiation (LTP), a proposed mechanism for learning and memory [3], could suffer from an inherent instability

Introduction

Nervous systems have two basic requirements: they must be stable and able to generate reliable, specific outputs, while at the same time they must be flexible enough to allow the output to change during development or as a result of experience. That a catastrophic-interference-like effect can be demonstrated under some conditions is of interest, as it suggests a basic limitation of storage systems built from a finite (albeit large) number of components, and further that the brain has presumably evolved a way of avoiding this phenomenon, allowing new information to be stored without disrupting previously stored information (but see [15]). Understanding this capability of the brain, and how it can be applied in artificial networks, could be of interest to both the psychological/neurobiological and technological communities.

In eqn (5) the first term on the right-hand side acts as the signal while the second term represents noise. Note that the first term is obtained by isolating in eqn (4) the relevant component, i.e. the i-th component of the pattern being retrieved (the n-th stored vector), while the overlaps of ξ_j^(n) with all the remaining vectors in the memory store are grouped together in the second term; it is these non-zero overlaps that obscure the signal and act as noise. In Figure 2 we present the result of a simulation showing how the quality of retrieval degrades as p/N exceeds 0.14.
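For a concrete, simplified picture of this crosstalk (analogous to, but not a reproduction of, the paper's Figure 2), the short simulation below stores p random ±1 patterns of length N with the plain Hebb rule and measures the one-step retrieval overlap. As the loading p/N approaches and exceeds roughly 0.14, the overlap with the stored pattern falls because the noise term begins to swamp the signal; the parameter choices here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def hebb_weights(patterns):
    """Plain Hebb-Hopfield storage: W = (1/N) * sum_n outer(xi_n, xi_n), zero diagonal."""
    N = patterns.shape[1]
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)
    return W

def retrieval_overlap(W, xi):
    """Overlap m = (1/N) xi . sign(W xi): m near 1 means the signal term dominates,
    smaller m means the crosstalk (noise) term is corrupting retrieval."""
    s = np.where(W @ xi >= 0, 1, -1)
    return (xi @ s) / xi.size

N = 500
for load in (0.05, 0.10, 0.14, 0.20, 0.30):    # loadings around the classical ~0.14 limit
    p = int(load * N)
    xi = rng.choice([-1, 1], size=(p, N))
    W = hebb_weights(xi)
    m = np.mean([retrieval_overlap(W, xi[n]) for n in range(p)])
    print(f"p/N = {load:.2f}   mean overlap = {m:.3f}")
```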

A Way Out of Catastrophic Interference
Findings
Discussion