Abstract

Brain-inspired computing is an emerging field that aims to reach brain-like performance in real-time processing of sensory data. The challenges that must be addressed to reach such a computational system include building a compact, massively parallel architecture with scalable interconnection devices, achieving ultralow power consumption, and developing robust neuromorphic computational schemes for implementing learning in hardware. In this paper, we discuss programming strategies, material characteristics, and spike schemes that enable the implementation of symmetric and asymmetric synaptic plasticity in devices based on phase-change materials. We demonstrate that energy consumption can be optimized by tuning the device operation regime and the spike scheme. Our simulations illustrate that a crossbar array consisting of synaptic devices and neurons can achieve hippocampus-like associative learning with symmetric synapses and sequence learning with asymmetric synapses. Pattern completion for patterns with 50% missing elements is achieved via associative learning with symmetric plasticity. The robustness of learning against input noise, variation in sensory data, and device resistance variation is investigated through simulations.
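To illustrate the kind of pattern completion described above, the following is a minimal Python sketch of an associative memory built on a symmetric weight matrix, which stands in for a crossbar of synaptic devices. It stores one binary pattern with a Hebbian (outer-product) rule and recovers it from a cue with 50% of the elements missing. All sizes, codings, and update rules here are illustrative assumptions; the paper's actual device models, spike schemes, and network parameters are not reproduced.

# Illustrative sketch only: a symmetric-weight associative memory standing in
# for a crossbar array of phase-change synaptic devices. Parameters and coding
# are assumptions for demonstration, not the authors' simulation setup.
import numpy as np

rng = np.random.default_rng(0)

N = 100                                   # number of neurons (crossbar rows/columns), assumed
pattern = rng.choice([-1, 1], size=N)     # stored binary pattern in +1/-1 coding

# Symmetric (Hebbian) learning: the outer-product rule yields W[i, j] == W[j, i],
# loosely analogous to the symmetric synaptic plasticity discussed in the abstract.
W = np.outer(pattern, pattern) / N
np.fill_diagonal(W, 0.0)                  # no self-connections

# Cue with 50% of the elements missing (set to 0), mirroring the pattern-completion test.
cue = pattern.copy()
missing = rng.choice(N, size=N // 2, replace=False)
cue[missing] = 0

# Recurrent recall: threshold the weighted sums until the state stops changing.
state = cue.astype(float)
for _ in range(10):
    new_state = np.sign(W @ state)
    new_state[new_state == 0] = 1         # break ties deterministically
    if np.array_equal(new_state, state):
        break
    state = new_state

print(f"fraction of pattern recovered: {np.mean(state == pattern):.2f}")

With a single stored pattern, this toy recall converges to the full pattern from the half-erased cue; it is meant only to make the associative-learning idea concrete, not to model device-level behavior such as resistance variation.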
