Abstract

With advances in artificial intelligence services, brain-inspired neuromorphic systems with synaptic devices have recently attracted significant interest as a way to circumvent the von Neumann bottleneck. However, the growing number of deep neural network parameters leads to large power consumption and area overhead in nonlinear neuron electronic circuits, and it gives rise to the vanishing gradient problem. Here, we present a memristor-based compact and energy-efficient neuron device that implements a rectified linear unit (ReLU) activation function. To emulate the volatile and gradual switching required by the ReLU function, we propose a memristor with a hybrid copolymer/inorganic bilayer structure. The functional copolymer film, developed by introducing imidazole functional groups, promotes the nucleation of Cu nanoclusters and thereby enables the formation of nanocluster-type pseudo-conductive filaments, which produce gradual switching. The ReLU neuron device is successfully demonstrated by integrating the memristor with an amorphous InGaZnO thin-film transistor, and it achieves an energy consumption of 0.5 pJ based on a sub-10 μA operating current and high-speed switching of 650 ns. Furthermore, device-to-system-level simulation using the neuron devices on the MNIST dataset demonstrates that the vanishing gradient problem is effectively resolved in five-layer deep neural networks. The proposed neuron device will enable the implementation of high-density and energy-efficient hardware neuromorphic systems.
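
As a rough consistency check of the reported switching energy, the short sketch below assumes the simple relation E ≈ V · I · t; the voltage and average current are illustrative guesses chosen to be consistent with the stated sub-10 μA current and 650 ns switching time, not figures taken from the article.

```python
# Back-of-envelope estimate, assuming E ≈ V * I * t.
# V and I are illustrative assumptions, not values reported in the article.
V = 0.5        # assumed operating voltage, volts
I = 1.5e-6     # assumed average current, amperes (sub-10 uA)
t = 650e-9     # switching time quoted in the abstract, seconds
print(f"E ~ {V * I * t:.2e} J")   # ~4.9e-13 J, i.e. about 0.5 pJ
```

To illustrate why a ReLU activation mitigates the vanishing gradient problem addressed by the device-to-system simulation, the following minimal NumPy sketch compares per-layer gradient magnitudes in a five-layer fully connected network with sigmoid versus ReLU hidden activations. It uses random MNIST-shaped data (784 inputs, 10 classes) and is only a toy illustration, not the simulator used in the article.

```python
# Toy illustration (not the article's device-to-system simulator): compares
# per-layer gradient magnitudes in a five-layer MLP with sigmoid vs. ReLU
# hidden activations, using random MNIST-shaped data (784 inputs, 10 classes).
import numpy as np

rng = np.random.default_rng(0)
sizes = [784, 256, 128, 64, 32, 10]          # six layer widths -> five weight layers

def init_weights():
    # Xavier-style initialization, std = 1/sqrt(fan_in)
    return [rng.normal(0, 1.0 / np.sqrt(m), size=(m, n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def act(z, kind):
    return np.maximum(z, 0.0) if kind == "relu" else 1.0 / (1.0 + np.exp(-z))

def act_grad(z, a, kind):
    return (z > 0).astype(z.dtype) if kind == "relu" else a * (1.0 - a)

def layer_grad_norms(kind, batch=128):
    x = rng.normal(0, 1, size=(batch, sizes[0]))   # stand-in for MNIST pixels
    y = rng.integers(0, sizes[-1], size=batch)     # stand-in for digit labels
    Ws = init_weights()

    # Forward pass: hidden layers use `kind`, output layer uses softmax.
    activations, pre_acts = [x], []
    for i, W in enumerate(Ws):
        z = activations[-1] @ W
        pre_acts.append(z)
        if i < len(Ws) - 1:
            activations.append(act(z, kind))
        else:
            z = z - z.max(axis=1, keepdims=True)
            p = np.exp(z)
            activations.append(p / p.sum(axis=1, keepdims=True))

    # Backward pass: softmax cross-entropy, then chain rule layer by layer.
    delta = activations[-1].copy()
    delta[np.arange(batch), y] -= 1.0
    delta /= batch
    norms = []
    for i in reversed(range(len(Ws))):
        grad_W = activations[i].T @ delta          # gradient of weight layer i
        norms.append(np.abs(grad_W).mean())
        if i > 0:
            delta = (delta @ Ws[i].T) * act_grad(pre_acts[i - 1],
                                                 activations[i], kind)
    return norms[::-1]   # ordered from first layer to last

for kind in ("sigmoid", "relu"):
    print(kind, ["%.2e" % n for n in layer_grad_norms(kind)])
```

In a typical run, the mean gradient magnitude of the earliest layers is several orders of magnitude smaller than that of the last layer with sigmoid activations, whereas the attenuation is far milder with ReLU, mirroring the behavior the proposed neuron device is designed to exploit.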
