Abstract

The standard Hopfield model is generalized to the case where input patterns carry weights proportional to the frequencies of their occurrence during learning. The main equation is derived by methods of statistical physics and solved for an arbitrary distribution of weights. An infinitely large number of input patterns can be written into the connection matrix; however, the memory of the network consists only of those patterns whose weights exceed a critical value. This approach eliminates the catastrophic destruction of memory characteristic of the standard Hopfield model.

Keywords: Hopfield model, catastrophic forgetting, weighted patterns
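The weighting scheme described above can be illustrated with a minimal sketch. This is not the paper's derivation; it assumes the standard Hebbian rule generalized so that each pattern's outer product is scaled by a weight, with hypothetical weight values chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 5                                  # neurons, number of stored patterns
patterns = rng.choice([-1, 1], size=(P, N))    # random binary patterns
weights = np.array([5.0, 3.0, 1.0, 0.5, 0.1])  # hypothetical pattern weights

# Weighted Hebbian connection matrix:
#   J = (1/N) * sum_mu  w_mu * xi^mu (xi^mu)^T,  zero diagonal
J = (weights[:, None, None]
     * (patterns[:, :, None] * patterns[:, None, :])).sum(axis=0) / N
np.fill_diagonal(J, 0.0)

def recall(state, steps=20):
    """Synchronous sign updates until a fixed point (or step limit)."""
    for _ in range(steps):
        new = np.sign(J @ state)
        new[new == 0] = 1
        if np.array_equal(new, state):
            break
        state = new
    return state

# A corrupted copy of the most heavily weighted pattern should be restored,
# since its weight lies well above those of the competing patterns.
noisy = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
noisy[flip] *= -1
restored = recall(noisy)
print(int((restored == patterns[0]).sum()), "of", N, "bits recovered")
```

In this toy setting the heavily weighted pattern acts as a strong attractor, while a pattern with a very small weight (here 0.1) may fail to be a stable memory at all, mirroring the critical-weight behavior the abstract describes.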
