Abstract

Artificial neural networks with nonvolatile memristors as synapses represent an in-memory computing paradigm, where the online training of networks of rapidly growing scale places high demands on device uniformity and endurance as well as on training time and energy. Here, a highly uniform two-terminal Hf0.5Zr0.5O2-based artificial synapse is reported, which shows 4-bit weight precision, cycle-to-cycle variation (σ/µ) of <4%, device-to-device variation (σ/µ) of <9%, retention of >10⁴ s at 85 °C, and endurance of >10⁶ cycles. The improved uniformity can be attributed to a nonfilamentary resistive switching mechanism mediated by ion exchange at the interface. A holistic optimization combining such reliable synapses with a modified algorithm based on sparsified backpropagation accelerates training by 166 times, decreases energy consumption by 83 times, and reduces the total updating rate by 138 times in a multilayer perceptron; the approach can be extended to convolutional neural networks as well. The synergistically optimized approach thus paves the way for the construction of memristor-based systems capable of learning and interacting adaptively.
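
To illustrate the algorithmic side, the following is a minimal sketch (not the authors' code) of sparsified backpropagation in a one-hidden-layer perceptron: only the top-k output-error components are propagated backward, so only the corresponding synaptic weights are updated, reducing update traffic. Layer sizes, k, and the learning rate are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (assumptions, not values from the paper).
n_in, n_hidden, n_out, k = 64, 32, 10, 3
W1 = rng.normal(0, 0.1, (n_in, n_hidden))
W2 = rng.normal(0, 0.1, (n_hidden, n_out))
lr = 0.05

def train_step(x, target):
    """One training step with top-k sparsified backpropagation."""
    global W1, W2
    # Forward pass.
    h = np.maximum(0, x @ W1)           # ReLU hidden layer
    y = h @ W2                          # linear output
    err = y - target                    # output error

    # Sparsification: keep only the k largest-magnitude error components,
    # so only the corresponding weight columns receive updates.
    keep = np.argsort(np.abs(err))[-k:]
    err_sparse = np.zeros_like(err)
    err_sparse[keep] = err[keep]

    # Backward pass using the sparsified error.
    dW2 = np.outer(h, err_sparse)
    dh = (err_sparse @ W2.T) * (h > 0)  # ReLU derivative
    dW1 = np.outer(x, dh)

    W1 -= lr * dW1
    W2 -= lr * dW2
    return 0.5 * np.sum(err ** 2)

# Usage example on random data.
x = rng.normal(size=n_in)
t = np.zeros(n_out); t[3] = 1.0
for _ in range(5):
    loss = train_step(x, t)
print(f"final loss: {loss:.4f}")

In a memristor array, the sparsified error would translate into fewer programming pulses per training step, which is the mechanism behind the reported reductions in training time, energy, and total updates.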
