Abstract

Quantal synaptic failures are random, independent events in which the arrival of an action potential fails to release transmitter. Although quantal failures destroy information, there exist conditions where such failures enhance learning by a neural network model of the hippocampus. In particular, we show how the appropriate failure rate can allow the model to learn the hippocampally dependent task of transverse patterning. Usefully, because lowered activity levels produce higher memory capacity and synaptic failures yield robust performance at those lowered activity levels, the failure mechanism lets the network exploit this increased capacity. Thus, the synaptic failure mechanism is another example of a random fluctuation that improves neural network computations.
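The failure process described above can be sketched as independent Bernoulli events per presynaptic spike. The following is a minimal illustrative model, not the paper's implementation; the function name `transmit`, the seed, and the spike representation are assumptions made for the example.

```python
import random

def transmit(spikes, failure_rate, seed=0):
    """Model quantal synaptic failures as independent Bernoulli events.

    Each presynaptic spike (1) releases transmitter with probability
    (1 - failure_rate); a failure delivers no postsynaptic signal (0).
    Silent time steps (0) are unaffected. Illustrative sketch only.
    """
    rng = random.Random(seed)  # seeded for reproducibility in this example
    return [s if (s and rng.random() >= failure_rate) else 0 for s in spikes]

# Deterministic extremes of the failure rate:
spikes = [1, 0, 1, 1, 0, 1]
print(transmit(spikes, 0.0))  # failure_rate 0: every spike transmitted
print(transmit(spikes, 1.0))  # failure_rate 1: every release fails
```

An intermediate failure rate thins the spike train stochastically, which is the regime the abstract argues can improve learning at lowered activity levels.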
