Abstract

The learning of temporal sequences is an extremely important component of human and animal behaviour. As well as the motor control involved in routine behaviour such as walking, running, talking, tool use and so on, humans have a remarkable capacity for learning (and subsequently reproducing) temporal sequences. A new connectionist model of temporal sequence learning is described which is based on recurrent self-organising maps. The model is shown to be both powerful and robust, and to exhibit a strong generalisation effect not found in simple recurrent networks (SRNs). The model combines two important developments in artificial neural networks: recursion and self-organising maps (SOMs). Both are found in the primate cortex: topological maps appear to be ubiquitous in the cerebral cortex of higher animals, especially in the primary sensory areas, and the neuroanatomy of the cortex also reveals numerous and consistent recurrent linkages between regions.

Keywords: Artificial Neural Network, Temporal Sequence, Output Vector, Hebbian Learning, Bradford Book
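The combination of self-organisation and recursion described above can be illustrated with a minimal sketch. This is not the paper's algorithm: the map size, the mixing weights `alpha` and `beta`, and the choice of feeding the previous winner's weight vector back as temporal context are all assumptions, loosely in the style of recurrent SOM variants. Each unit holds an input weight and a context weight; the best-matching unit is chosen by a combined distance, and both weights are nudged toward the current input and context.

```python
import numpy as np

# Illustrative recurrent-SOM step (assumed parameters, not the paper's model):
# each unit has an input weight and a context weight; the temporal context
# fed to the next step is the current winner's input weight.

rng = np.random.default_rng(0)

n_units, dim = 16, 3     # map size and input dimensionality (assumed)
alpha, beta = 0.5, 0.5   # mixing of input vs. temporal context (assumed)
lr = 0.1                 # learning rate

w_in = rng.random((n_units, dim))    # input weights
w_ctx = rng.random((n_units, dim))   # context weights
context = np.zeros(dim)              # temporal context, initially empty

def step(x):
    """Process one element of a temporal sequence; return the winner index."""
    global context
    d = (alpha * np.sum((w_in - x) ** 2, axis=1)
         + beta * np.sum((w_ctx - context) ** 2, axis=1))
    k = int(np.argmin(d))            # best-matching unit
    # Move the winner toward the current input and context
    w_in[k] += lr * (x - w_in[k])
    w_ctx[k] += lr * (context - w_ctx[k])
    context = w_in[k].copy()         # recursion: winner becomes next context
    return k

seq = [rng.random(dim) for _ in range(20)]
winners = [step(x) for x in seq]
print(len(winners), all(0 <= k < n_units for k in winners))
```

Because the context term enters the distance calculation, the same input can map to different units depending on what preceded it, which is what lets a map of this kind represent sequences rather than isolated inputs.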
