Abstract

Memristor crossbar arrays carry out multiply-add operations in parallel in the analog domain and can therefore enable neuromorphic systems with high throughput at low energy and area consumption. On-chip training of these systems has the significant advantage of being able to work around device variability and faults. This paper presents on-chip training circuits for multi-layer neural networks implemented using a single crossbar per layer and two memristors per synapse. Using two memristors per synapse doubles the synaptic weight precision compared to a design that uses only one memristor per synapse. The proposed on-chip training system uses the back-propagation (BP) algorithm for synaptic weight updates. Because two memristors are used per synapse, we employ a novel technique for error back propagation. We evaluated training of the system on several nonlinearly separable datasets through detailed SPICE simulations that account for crossbar wire resistance and sneak paths. Our results show that the crossbars in the proposed design consume about 9× less power than a single-memristor-per-synapse design.
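
For intuition, the sketch below shows how a differential two-memristor synapse maps onto a crossbar multiply-accumulate and a back-propagation-style conductance update. It is a behavioural Python illustration only, not the paper's circuit: the conductance range, learning-rate step, and function names are assumptions, and it omits the wire resistance, sneak paths, and training circuitry that the paper models in SPICE.

```python
import numpy as np

# Illustrative parameters (assumptions, not values from the paper)
G_MIN, G_MAX = 1e-6, 1e-4   # memristor conductance range in siemens
ETA = 1e-6                  # conductance step playing the role of a learning rate

def crossbar_mac(v_in, G_pos, G_neg):
    """Analog multiply-accumulate of one crossbar layer.

    Each synapse is a differential pair of memristors: the effective
    weight is proportional to (G_pos - G_neg), so the column output
    currents are I = v_in @ (G_pos - G_neg).
    """
    return v_in @ (G_pos - G_neg)

def update_synapses(G_pos, G_neg, v_in, delta):
    """Back-propagation-style update mapped onto the memristor pair.

    The ideal weight change is ETA * outer(v_in, delta); a positive
    change is applied to G_pos and a negative change to G_neg, keeping
    both conductances inside their physical range.
    """
    dW = ETA * np.outer(v_in, delta)
    G_pos = np.clip(G_pos + np.maximum(dW, 0.0), G_MIN, G_MAX)
    G_neg = np.clip(G_neg + np.maximum(-dW, 0.0), G_MIN, G_MAX)
    return G_pos, G_neg

# Toy usage: one 4-input, 3-output layer with random conductances
rng = np.random.default_rng(0)
G_pos = rng.uniform(G_MIN, G_MAX, size=(4, 3))
G_neg = rng.uniform(G_MIN, G_MAX, size=(4, 3))
v = rng.uniform(-0.1, 0.1, size=4)          # input voltages
out = crossbar_mac(v, G_pos, G_neg)         # column output currents
G_pos, G_neg = update_synapses(G_pos, G_neg, v, delta=np.ones(3) * 1e-3)
```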
