Abstract

Unlike modern computers, which compute with digital ‘0’s and ‘1’s, neural networks in the human brain exhibit analog changes in neural connections (i.e., synaptic weights) during decision-making and learning. This analog nature, together with the network's massive parallelism, is partly why the human brain (~20 W) outperforms even the most powerful computers (~1 MW) at complex tasks such as pattern recognition, and with far better energy efficiency. Currently, the majority of research efforts toward artificial neural networks are based on digital CMOS technology [1], which cannot mimic the analog behavior of biological synapses and is therefore energy-intensive. Recently, emerging memory devices such as phase change memory (PCM), resistive random access memory (RRAM), and spin-transfer torque (STT) RAM [2]–[4] have been studied as artificial synapses, exploiting their programmable conductance. While these approaches are promising, they still face limitations including poor controllability, poor reliability, large variability, and an asymmetric resistance response.
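
To make the last limitation concrete, below is a minimal sketch (not from the paper) of a soft-bound conductance-update model of the kind commonly used to describe PCM/RRAM synapses. All names and parameters here (G_MIN, G_MAX, the step fractions ALPHA_P and ALPHA_D, and the pulse counts) are illustrative assumptions, not measured device values. The point it demonstrates: identical SET and RESET pulses produce state-dependent step sizes with different rate constants, so the potentiation and depression traces do not mirror each other.

```python
# Hypothetical soft-bound synapse model (illustrative assumptions, not
# measured device data or the paper's method).
G_MIN, G_MAX = 1e-6, 1e-5   # conductance window in siemens (assumed)
ALPHA_P = 0.05              # potentiation step fraction per SET pulse (assumed)
ALPHA_D = 0.15              # depression step fraction per RESET pulse (assumed, steeper)

def potentiate(g: float) -> float:
    """One SET pulse: the step shrinks as g approaches G_MAX (soft bound)."""
    return min(g + ALPHA_P * (G_MAX - g), G_MAX)

def depress(g: float) -> float:
    """One RESET pulse: a different rate constant than potentiation,
    so the down trace is not a mirror image of the up trace."""
    return max(g - ALPHA_D * (g - G_MIN), G_MIN)

g = G_MIN
up = []
for _ in range(50):         # 50 identical SET pulses
    g = potentiate(g)
    up.append(g)
down = []
for _ in range(50):         # 50 identical RESET pulses
    g = depress(g)
    down.append(g)

# After equal numbers of opposing pulses, the device does not land where a
# symmetric response would leave it: the asymmetry cited in the abstract.
print(f"g after 50 SET pulses:   {up[-1]:.3e} S")
print(f"g after 50 RESET pulses: {down[-1]:.3e} S")
```

In a training loop, such asymmetry means a weight nudged up and then down by nominally equal updates does not return to its original value, which biases gradient-descent-style learning; this is one reason device-level symmetry matters for analog synaptic hardware.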
