Abstract

Recent studies have shown that memristor-crossbar-based neuromorphic hardware enables high-performance implementations of neural networks at low power and in low chip area. This paper presents circuits to train a cascaded set of memristor crossbars representing a multi-layer neural network. The circuits implement back-propagation training and would enable on-chip training of memristor crossbars. On-chip training can be necessary to overcome the effects of device variability and of alternate current paths within crossbars used as neural networks. We model the memristor crossbars in SPICE in order to capture alternate current paths and the impact of wire resistance. Our design scales to multiple neural layers and multiple output neurons. Through detailed SPICE simulations, we demonstrate the training of networks of up to three layers on non-linearly separable functions. To our knowledge, this is the first study in the literature to examine the implementation of back-propagation-based training of memristor crossbar circuits. This work would enable the design of highly energy-efficient and compact neuromorphic processing systems that can be trained to implement large deep networks (such as deep belief networks).
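
To make the training scheme concrete, below is a minimal behavioral sketch in Python. It is not the paper's analog circuitry or SPICE model; it only illustrates back-propagation applied to weights stored as differential memristor conductance pairs, with updates clipped to a bounded device range. The conductance bounds (G_MIN, G_MAX), the weight-mapping scale factor, the network sizes, and the choice of XOR as the non-linearly separable benchmark are all illustrative assumptions.

```python
import numpy as np

# Behavioral sketch (assumptions, not the paper's circuits): each signed
# weight is represented by a differential conductance pair, w = SCALE*(G+ - G-),
# and back-propagation updates are split across the pair and clipped to the
# device's conductance range.

rng = np.random.default_rng(0)

G_MIN, G_MAX = 1e-6, 1e-4   # assumed memristor conductance range (siemens)
SCALE = 1e5                 # assumed mapping from conductance difference to weight

def init_pair(shape):
    """Conductance pair near mid-range, with a small random spread."""
    mid = 0.5 * (G_MIN + G_MAX)
    return (mid + rng.uniform(-1e-5, 1e-5, shape),
            mid + rng.uniform(-1e-5, 1e-5, shape))

def weight(gp, gn):
    return SCALE * (gp - gn)

def update_pair(gp, gn, dw):
    """Split a signed weight update across the pair and clip to the device
    range -- a crude stand-in for bounded memristor programming."""
    gp = np.clip(gp + dw / (2 * SCALE), G_MIN, G_MAX)
    gn = np.clip(gn - dw / (2 * SCALE), G_MIN, G_MAX)
    return gp, gn

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR: a standard non-linearly separable benchmark, in the spirit of the
# functions trained in the paper's SPICE simulations.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)

W1p, W1n = init_pair((2, 4)); b1 = np.zeros(4)   # hidden layer (crossbar 1)
W2p, W2n = init_pair((4, 1)); b2 = np.zeros(1)   # output layer (crossbar 2)

lr = 0.5
for _ in range(20000):
    W1, W2 = weight(W1p, W1n), weight(W2p, W2n)
    h = sigmoid(X @ W1 + b1)          # forward pass through crossbar 1
    y = sigmoid(h @ W2 + b2)          # forward pass through crossbar 2
    d2 = (y - Y) * y * (1 - y)        # output error term (MSE loss)
    d1 = (d2 @ W2.T) * h * (1 - h)    # error back-propagated to layer 1
    W2p, W2n = update_pair(W2p, W2n, -lr * h.T @ d2)
    W1p, W1n = update_pair(W1p, W1n, -lr * X.T @ d1)
    b2 -= lr * d2.sum(axis=0); b1 -= lr * d1.sum(axis=0)

h = sigmoid(X @ weight(W1p, W1n) + b1)
print(np.round(sigmoid(h @ weight(W2p, W2n) + b2), 2))  # typically near [0, 1, 1, 0]
```

The differential-pair mapping is one common way to realize signed weights with strictly positive conductances; the clipping in update_pair is where device-range limits, and by extension variability, would enter such a model.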
