Abstract

Conventional neural network models for temporal association generally perform poorly in the absence of synchronizing neurons, because their dynamical properties are fundamentally unsuited to storing sequential patterns, regardless of the storage or learning algorithm used. The present article describes a nonmonotone neural network (NNN) model in which sequential patterns are stored by being embedded in a trajectory attractor of the dynamical system and are recalled stably and smoothly without synchronization: during recall, the network state moves successively along the trajectory. A simple and natural learning algorithm for the NNN is also presented, in which one need only vary the input pattern gradually and modify the synaptic weights according to a kind of covariance rule; the network state then follows slightly behind the input pattern, and its trajectory grows into an attractor after a small number of repetitions. Copyright © 1996 Elsevier Science Ltd.
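
To make the described procedure concrete, the following is a minimal NumPy sketch of the scheme the abstract outlines, not the paper's own formulation: the particular nonmonotone activation, the Hebb-with-decay stand-in for the covariance rule, and every constant (N, P, tau, tau_w, alpha, the step counts) are illustrative assumptions.

```python
# Hypothetical sketch of the scheme in the abstract -- not the paper's code.
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 4                        # neurons, patterns in the sequence (assumed)
patterns = rng.choice([-1.0, 1.0], size=(P, N))

def f(u, c=10.0, cp=10.0, h=0.5, kappa=-1.0):
    """Assumed nonmonotone activation: near-sigmoidal for small |u|,
    reversing sign once |u| exceeds h (the qualitative shape such
    models rely on)."""
    e = np.exp(np.clip(cp * (np.abs(u) - h), -50.0, 50.0))
    return np.tanh(c * u / 2.0) * (1.0 + kappa * e) / (1.0 + e)

# Learning: morph the input gradually from one pattern to the next while
# applying a covariance-type (here: Hebbian-with-decay) weight update,
# so the network state follows slightly behind the input.
W = np.zeros((N, N))
tau, tau_w, alpha = 1.0, 50.0, 2.0   # time constants and gain (assumed)
dt, steps_per_leg, epochs = 0.1, 200, 5
u = patterns[0].copy()

for _ in range(epochs):
    for k in range(P - 1):
        for t in range(steps_per_leg):
            lam = t / steps_per_leg                  # interpolation coefficient
            z = (1 - lam) * patterns[k] + lam * patterns[k + 1]
            y = f(u)
            u += dt / tau * (-u + W @ y + z)         # network dynamics
            W += dt / tau_w * (-W + (alpha / N) * np.outer(y, y))

# Recall: cue with the first pattern, then run the free dynamics (no
# external input) and track the overlap of the state with each pattern.
u = patterns[0].copy()
for t in range(P * steps_per_leg):
    y = f(u)
    u += dt / tau * (-u + W @ y)
    if t % 100 == 0:
        print(t, np.round(patterns @ np.sign(u) / N, 2))
```

During training the input z is interpolated between consecutive patterns, mirroring the "vary the input pattern gradually" step; during recall the cue alone drives the state, and successful trajectory-attractor recall would show up as the printed overlaps peaking one pattern after another.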
