Abstract

In biological neural networks, synaptic connections and their modification by Hebbian forms of associative learning have been shown in recent years to have quite complex dynamic characteristics. As yet, these dynamic forms of connection and learning have had little impact on the design of computational neural networks. It is clear, however, that for the processing of various forms of information in which the temporal nature of the data is important, e.g. in temporal sequence learning and in contextual learning, such dynamic characteristics may play an important role. In this paper we review the neuroscientific evidence for the dynamic characteristics of learning and memory, and propose a novel computational associative learning rule that takes account of this evidence. We show that applying this learning rule allows us to mimic, in a computationally simple way, certain characteristics of the biological learning process. In particular, we show that the learning rule displays temporal asymmetry effects similar to those that result in either long-term potentiation or long-term depression at the biological synapse.
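The abstract does not give the form of the proposed rule, so the following is only a minimal illustrative sketch of a generic temporally asymmetric (spike-timing-dependent) Hebbian update of the kind the abstract alludes to: a pre-before-post spike pairing strengthens the connection (LTP-like), while a post-before-pre pairing weakens it (LTD-like). All function names, parameter names and values here are assumptions for illustration, not the authors' rule.

```python
import numpy as np

# Assumed amplitudes and time constants (ms) for the asymmetric learning window.
A_PLUS, A_MINUS = 0.01, 0.012
TAU_PLUS, TAU_MINUS = 20.0, 20.0

def weight_change(delta_t):
    """Weight update as a function of post-minus-pre spike timing (ms).

    Positive delta_t (pre fires before post) gives a positive change (LTP-like);
    negative delta_t (post fires before pre) gives a negative change (LTD-like).
    """
    if delta_t > 0:
        return A_PLUS * np.exp(-delta_t / TAU_PLUS)
    return -A_MINUS * np.exp(delta_t / TAU_MINUS)

# Example: a pre spike 5 ms before the post spike potentiates the synapse,
# while a pre spike 5 ms after the post spike depresses it.
print(weight_change(5.0))    # > 0, potentiation
print(weight_change(-5.0))   # < 0, depression
```

The exponential window and the asymmetry between the potentiation and depression branches are what give the rule its sensitivity to the temporal order of events, which is the property the paper exploits for temporal sequence and contextual learning.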
