Abstract

Multilevel cell (MLC) memristors, which provide high-density on-chip memory, have become a promising solution for energy-efficient artificial neural networks (ANNs). However, because MLC storage packs multiple bits into each cell, it is prone to device variation. In this paper, the device variation tolerance of ANN training is investigated using our cell-specific variation modeling method, which characterizes realistic cell-level variation. The parameters of cycle-to-cycle variation (CCV) and device-to-device variation (DDV) are extracted separately from experimental data of a 39-nm, 1-Gb phase-change random access memory (PCRAM) array. A quantized neural network designed for low bit-width (≥6-bit) training is simulated to demonstrate the potential of MLC storage. Our results show that training is more vulnerable to DDV than to CCV, and that CCV can even compensate for the accuracy degradation caused by severe DDV. As a result, a multilayer perceptron (MLP) on the Modified National Institute of Standards and Technology (MNIST) database achieves 95% accuracy with three MLC PCRAM devices per weight, a 40% reduction in cell count compared with conventional single-level cells (SLCs). If the magnitude of DDV is halved, only two cells per weight are needed, 60% fewer than with SLCs.
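To make the two variation sources concrete, the sketch below shows one plausible way to inject CCV and DDV when a quantized weight is mapped onto several MLC cells. This is not the authors' code: the cell count, level count, the Gaussian noise model, and both sigma values are illustrative assumptions; the paper extracts the actual distributions from PCRAM measurements. The key distinction it encodes is that DDV is a per-device offset sampled once and frozen, while CCV is re-sampled on every programming cycle.

```python
import numpy as np

# Hypothetical illustration (not the paper's model): one weight stored
# across N_CELLS MLC devices, each holding LEVELS conductance levels.
rng = np.random.default_rng(0)

N_CELLS = 3          # MLC devices per weight (as in the paper's MLP result)
LEVELS = 4           # assumed 2-bit MLC -> 4 levels per cell
SIGMA_DDV = 0.10     # assumed device-to-device spread (fraction of a level)
SIGMA_CCV = 0.05     # assumed cycle-to-cycle spread (fraction of a level)

# Device-to-device variation: sampled once per cell, then frozen.
ddv_offset = rng.normal(0.0, SIGMA_DDV, size=N_CELLS)

def program_weight(target_levels):
    """Program one weight as N_CELLS target levels; return read-back levels.

    Cycle-to-cycle variation is re-sampled on every call, modeling a
    fresh write cycle, while the DDV offsets stay fixed.
    """
    ccv_noise = rng.normal(0.0, SIGMA_CCV, size=N_CELLS)
    analog = target_levels + ddv_offset + ccv_noise
    return np.clip(analog, 0, LEVELS - 1)

# Example: a 6-bit weight split across three 2-bit cells,
# ordered from most- to least-significant cell.
levels = np.array([3, 1, 2])
readout = program_weight(levels)
weight = sum(v * LEVELS**i for i, v in enumerate(readout[::-1]))
print(weight)  # nominal value 3*16 + 1*4 + 2 = 54, perturbed by variation
```

Under this kind of model, the paper's finding is intuitive: the frozen DDV offsets bias every write of a given cell in the same direction, so training cannot average them out, whereas zero-mean CCV acts like stochastic noise across update cycles.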
