Abstract

A self-organizing neural network that learns and recalls multiple sequences is presented. Each sequence may contain recurring items, and several sequences may share one or more common items. The self-organizing temporal network stores each sequence independently of the others, and learning a new sequence does not require retraining the network on previously learnt sequences. An experimental assessment using a benchmark set of sequences suggests that the network recalls stored sequences in their intact form when presented with their sequence identity vectors or with their constituent subsequences. The utility of the proposed temporal network, which also exhibits multimodality, is demonstrated by incorporating the network into a gated multinet model of early child language development.
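The abstract does not specify the network's internal mechanics, so the sketch below is not the paper's self-organizing architecture. It is only a minimal, hypothetical stand-in (a plain dictionary-based store, with invented names such as SequenceStore and recall_by_subsequence) that illustrates the recall behaviour described above: each sequence is stored independently under its own identity key, sequences may share or repeat items, and an intact sequence can be retrieved either from its identity or from a contiguous subsequence.

```python
# Toy illustration only: a dictionary-backed sequence store, NOT the paper's
# self-organizing temporal network. It mimics the recall interface the
# abstract describes (recall by identity vector or by subsequence).

class SequenceStore:
    def __init__(self):
        # identity key -> stored sequence (tuple of items)
        self._sequences = {}

    def learn(self, identity, sequence):
        """Store a sequence under its identity; previously stored sequences are untouched."""
        self._sequences[identity] = tuple(sequence)

    def recall_by_identity(self, identity):
        """Return the intact sequence associated with an identity key, or None."""
        return self._sequences.get(identity)

    def recall_by_subsequence(self, subsequence):
        """Return every stored (identity, sequence) containing the given contiguous subsequence."""
        sub = tuple(subsequence)
        matches = []
        for identity, seq in self._sequences.items():
            if any(seq[i:i + len(sub)] == sub
                   for i in range(len(seq) - len(sub) + 1)):
                matches.append((identity, seq))
        return matches


if __name__ == "__main__":
    store = SequenceStore()
    # Sequences may contain recurring items ("a" twice) and share items ("b").
    store.learn("S1", ["a", "b", "a", "c"])
    store.learn("S2", ["x", "b", "y"])
    print(store.recall_by_identity("S1"))           # ('a', 'b', 'a', 'c')
    print(store.recall_by_subsequence(["b", "a"]))  # [('S1', ('a', 'b', 'a', 'c'))]
```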
