Abstract

A memory capacity exists for artificial neural networks of associative memory. Adding new memories beyond this capacity overloads the network and makes all learned memories irretrievable (catastrophic forgetting) unless there is a provision for forgetting old memories. This article describes a property of associative memory networks in which a number of units are replaced as the network learns: every time the network learns a new item or pattern, a number of units are erased and the same number of new units are added. It is shown that the memory capacity of the network depends on the number of replaced units, and that there exists an optimal number of replaced units at which the memory capacity is maximized. The optimal number of replaced units is small and appears to be independent of the network size.
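The mechanism described above can be sketched in a minimal way. The following is an illustrative assumption of one possible implementation, not the paper's exact model: a standard Hopfield-style network with Hebbian storage, where "replacing" a unit after each learned pattern is modeled by zeroing that unit's incoming and outgoing weights (erasing the old unit and inserting a fresh, unconnected one). The network size `N`, the replacement count `K`, and the recall dynamics are all hypothetical choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100  # number of units (illustrative size, not from the paper)
K = 3    # units replaced after each learned pattern (the paper's key parameter)

W = np.zeros((N, N))  # Hopfield-style symmetric weight matrix


def learn(pattern):
    """Hebbian storage of one +/-1 pattern, followed by replacing K units."""
    global W
    W += np.outer(pattern, pattern) / N
    np.fill_diagonal(W, 0.0)
    # "Replace" K randomly chosen units: erase all of their connections,
    # modeling removal of an old unit and insertion of a fresh one.
    replaced = rng.choice(N, size=K, replace=False)
    W[replaced, :] = 0.0
    W[:, replaced] = 0.0


def recall(probe, steps=20):
    """Synchronous sign-threshold update dynamics."""
    s = probe.copy()
    for _ in range(steps):
        s = np.where(W @ s >= 0, 1, -1)
    return s


# Store a few random patterns, then test retrieval of the newest one
# from a noisy probe (a handful of flipped units).
patterns = [rng.choice([-1, 1], size=N) for _ in range(5)]
for p in patterns:
    learn(p)

noisy = patterns[-1].copy()
flips = rng.choice(N, size=5, replace=False)
noisy[flips] *= -1
overlap = np.dot(recall(noisy), patterns[-1]) / N
```

Sweeping `K` and measuring how many patterns remain retrievable would reproduce, in spirit, the experiment the abstract summarizes: too little replacement leads to overload, too much destroys recent memories, and some small intermediate `K` maximizes capacity.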
