Abstract

A memory capacity exists for artificial neural networks of associative memory. The addition of new memories beyond this capacity overloads the network and makes all learned memories irretrievable (catastrophic forgetting) unless there is a provision for forgetting old memories. This article describes a property of associative memory networks in which a number of units are replaced as the network learns. In our network, every time the network learns a new item or pattern, a number of units are erased and the same number of units are added. It is shown that the memory capacity of the network depends on the number of replaced units, and that there exists an optimal number of replaced units at which the memory capacity is maximized. The optimal number of replaced units is small, and appears to be independent of the network size.
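The unit-replacement scheme described above can be sketched with a minimal Hopfield-style associative memory. This is an illustrative reconstruction, not the authors' implementation: the network size `N`, the replacement count `K`, Hebbian storage, and synchronous sign-threshold dynamics are all assumptions made for the sketch. Replacing a unit is modeled as zeroing all of its connections (the erased unit) and treating the same index as a fresh, unconnected unit (the added one).

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100  # network size (assumed value for illustration)
K = 3    # units replaced per learned pattern (assumed; the article finds a small optimum)

W = np.zeros((N, N))  # symmetric weight matrix, no self-connections


def learn(W, pattern):
    """Store one +/-1 pattern by a Hebbian update, then replace K units."""
    W += np.outer(pattern, pattern) / N
    np.fill_diagonal(W, 0.0)
    # Replacement step: erase every connection of K randomly chosen units,
    # modeling the removal of old units and the addition of fresh ones.
    replaced = rng.choice(N, size=K, replace=False)
    W[replaced, :] = 0.0
    W[:, replaced] = 0.0
    return W


def recall(W, probe, steps=20):
    """Run synchronous sign-threshold retrieval dynamics from a probe state."""
    s = probe.copy()
    for _ in range(steps):
        s = np.where(W @ s >= 0, 1, -1)
    return s


# Store a few random patterns, replacing K units after each one.
patterns = [rng.choice([-1, 1], size=N) for _ in range(5)]
for p in patterns:
    W = learn(W, p)

# Overlap of the retrieved state with the most recently learned pattern
# (1.0 would be perfect recall).
overlap = patterns[-1] @ recall(W, patterns[-1]) / N
print(f"overlap with last pattern: {overlap:.2f}")
```

At this light loading (5 patterns, well under the classical capacity of roughly 0.14·N), the most recent pattern is recalled almost perfectly despite the replaced units; the article's point is that as more patterns are added, the choice of `K` trades off forgetting old memories against retaining recent ones, with a small `K` maximizing capacity.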
