Abstract

Implementations of artificial neural networks as analog VLSI circuits differ in their method of synaptic weight storage (digital weights, analog EEPROMs, or capacitive weights) and in whether learning is performed locally at the synapses or off-chip. In this paper, we explain the principles of analog networks with in situ (local) synaptic learning of capacitive weights, together with test results from CMOS implementations from our laboratory. Synapses for both simple Hebbian and mean field networks are investigated. Synaptic weights may be refreshed by periodic rehearsal on the training data, which compensates for temperature drift and other nonstationarity. Compact, high-performance layouts have been obtained in which learning adjusts for component variability.

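As a rough illustration of the rehearsal idea in the abstract (this is a simulation sketch, not code or data from the paper), the following Python snippet models a capacitively stored weight vector that leaks charge every time step and is refreshed by periodic simple Hebbian rehearsal on the training set. All names, rates, and time constants here are hypothetical.

```python
# Minimal sketch: leaky capacitive weights maintained by periodic
# Hebbian rehearsal on the training data. Rates and constants are
# illustrative assumptions, not values from the paper.

import numpy as np

rng = np.random.default_rng(0)

n_inputs = 8
n_patterns = 4
X = rng.choice([-1.0, 1.0], size=(n_patterns, n_inputs))  # training inputs
t = rng.choice([-1.0, 1.0], size=n_patterns)              # target outputs

w = np.zeros(n_inputs)   # weights stored as capacitor voltages (notionally)
eta = 0.05               # learning rate (charge injected per update)
leak = 0.01              # fractional charge lost per time step

def rehearsal_pass(w):
    """One rehearsal pass: simple Hebbian updates over the training set."""
    for x, target in zip(X, t):
        w += eta * target * x    # Hebbian correlation of input and target
    return w

for step in range(500):
    w *= (1.0 - leak)            # capacitor leakage decays every weight
    if step % 10 == 0:           # periodic rehearsal refreshes the weights
        w = rehearsal_pass(w)

# Despite continual leakage, rehearsal holds the weights at a steady
# state whose signs typically reproduce the targets.
print("learned signs:", np.sign(X @ w))
print("targets:      ", t)
```

In hardware, the analogous refresh would presumably be charge injection onto the weight capacitors during rehearsal passes; the simulation only shows why periodic rehearsal counteracts a steady leakage or drift term.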