Abstract

An approach to storing temporal sequences that handles complex temporal sequences directly is presented. Short-term memory (STM) is modeled by units composed of recurrent excitatory connections between two neurons, and a dual-neuron model is proposed. By applying the Hebbian learning rule at each synapse together with a normalization rule over all synaptic weights of a neuron, it is shown that a quantity called the input potential increases monotonically with sequence presentation, and that the neuron fires only when its input signals arrive in a specific order. These sequence-detecting neurons form the basis of a model of complex sequence recognition that tolerates distortions of the learned sequences. A two-layer recurrent network is provided for reproducing complex sequences.
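The abstract's core mechanism can be illustrated with a minimal sketch: a Hebbian update at each synapse followed by normalization over all of a neuron's weights, with an exponentially decaying trace standing in for short-term memory. All names, the decay constant, and the learning rate below are illustrative assumptions, not details from the paper; the sketch only shows that under these rules the input potential for the trained sequence grows monotonically and ends higher than for a differently ordered sequence.

```python
import numpy as np

n_inputs = 4
# Synaptic weights of one sequence-detecting neuron, normalized to sum to 1.
w = np.full(n_inputs, 1.0 / n_inputs)

def present_sequence(w, order, decay=0.5, lr=0.1):
    """Present one input per time step; the STM trace decays geometrically,
    so earlier items in the sequence leave weaker traces. Returns the
    updated (renormalized) weights and the input potential w . trace."""
    trace = np.zeros_like(w)
    for i in order:
        trace *= decay      # short-term memory decay (assumed exponential)
        trace[i] = 1.0      # current input fully active
    potential = float(w @ trace)   # input potential at the end of the sequence
    w = w + lr * trace             # Hebbian update: co-activity strengthens
    w = w / w.sum()                # normalization over the neuron's synapses
    return w, potential

learned = [0, 1, 2, 3]
potentials = []
for _ in range(20):
    w, potential = present_sequence(w, learned)
    potentials.append(potential)
```

Because the weight vector is pulled toward the trace profile of the learned order and renormalized each time, `potentials` increases strictly at every presentation, and the trained neuron responds more strongly to its learned order than to the same inputs presented in reverse, which is the sequence-selectivity property the abstract describes.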

