Abstract
In-memory computing with emerging non-volatile memory devices (eNVMs) has shown promising results in accelerating matrix-vector multiplications. However, activation functions are still computed on general-purpose processors or with large, complex neuron peripheral circuits. Here, we present the integration of Ag-based conductive bridge random access memory (Ag-CBRAM) crossbar arrays with Mott rectified linear unit (ReLU) activation neurons for a scalable, energy- and area-efficient hardware (HW) implementation of deep neural networks. We develop Ag-CBRAM devices that achieve a high ON/OFF ratio and multi-level programmability. Compact, energy-efficient Mott ReLU neuron devices implementing the ReLU activation function are connected directly to the columns of the Ag-CBRAM crossbars to compute the output from the weighted-sum current. We implement convolution filters and activations for VGG-16 using our integrated HW and demonstrate the successful generation of feature maps for CIFAR-10 images in HW. Our approach paves the way toward building highly compact and energy-efficient eNVM-based in-memory computing systems.
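The core operation the abstract describes, an analog weighted sum accumulated along each crossbar column followed by a ReLU at the column output, can be summarized in a few lines. The NumPy sketch below is purely illustrative: the array dimensions, voltage range, and differential-pair weight encoding (`g_pos`, `g_neg`) are assumptions made for the sketch, not parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical differential-pair encoding: a signed weight w is stored as
# two Ag-CBRAM conductances, w ~ g_pos - g_neg (arbitrary units here).
g_pos = rng.uniform(0.0, 1.0, size=(9, 4))   # 3x3 filter patch -> 4 filters
g_neg = rng.uniform(0.0, 1.0, size=(9, 4))

# Input voltages applied to the crossbar rows (one flattened image patch).
v_in = rng.uniform(0.0, 0.2, size=9)

# Column weighted sum: I_j = sum_i (G+_ij - G-_ij) * V_i.
i_col = (g_pos - g_neg).T @ v_in

# The Mott neuron at each column output applies ReLU: f(I) = max(I, 0).
feature = np.maximum(i_col, 0.0)
print(feature)
```

In the physical array this summation is performed by Kirchhoff's current law rather than a matrix multiply, and the Mott neuron realizes the max(·, 0) nonlinearity directly in the current domain.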