Abstract

Multilevel cell (MLC) memristors, which provide high-density on-chip memory, have become a promising solution for energy-efficient artificial neural networks (ANNs). However, because MLC storage packs multiple bits into each cell, it is prone to device variation. In this paper, the device-variation tolerance of ANN training is investigated based on our cell-specific variation modeling method, which characterizes realistic cell-level variation. The parameters of cycle-to-cycle variation (CCV) and device-to-device variation (DDV) are extracted separately from experimental data of a 39-nm, 1-Gb phase-change random access memory (PCRAM) array. A quantized neural network designed for low-bit-width (≥6-bit) training is used in simulations to demonstrate the potential of MLC storage. Our results show that training is more vulnerable to DDV than to CCV, and that CCV can even compensate for the accuracy degradation caused by severe DDV. Consequently, for a multilayer perceptron (MLP) on the Modified National Institute of Standards and Technology (MNIST) database, 95% accuracy can be achieved with three MLC PCRAM devices per weight, a 40% reduction in cell count compared with conventional single-level cells (SLCs). If the magnitude of DDV is halved, only two cells per weight are needed, i.e., 60% fewer than with SLCs.
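
As a rough illustration of the CCV/DDV distinction the abstract draws, the sketch below models each variation source as additive Gaussian noise on a cell's programmed conductance level: the DDV offset is drawn once per cell and persists across programming cycles, while CCV is re-sampled on every write. The standard deviations `SIGMA_CCV` and `SIGMA_DDV`, the 2-bit level count, and the `make_array`/`program` helpers are hypothetical placeholders for illustration; they are not the parameters extracted from the 39-nm PCRAM array.

```python
# Minimal sketch of the CCV/DDV noise model described in the abstract.
# Assumptions (not from the paper): both variation sources are Gaussian,
# DDV is a fixed per-cell offset drawn once, and CCV is fresh noise on
# every programming cycle. All sigmas are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(0)

N_CELLS = 3        # MLC cells per weight, as in the 95%-accuracy result
LEVELS = 4         # 2-bit MLC: four conductance levels per cell
SIGMA_CCV = 0.05   # hypothetical cycle-to-cycle std (normalized levels)
SIGMA_DDV = 0.10   # hypothetical device-to-device std (normalized levels)

def make_array(n_weights):
    """Draw a fixed DDV offset for every cell (one per device, drawn once)."""
    return rng.normal(0.0, SIGMA_DDV, size=(n_weights, N_CELLS))

def program(targets, ddv_offsets):
    """Write target levels; CCV is re-sampled on every programming cycle."""
    ccv = rng.normal(0.0, SIGMA_CCV, size=ddv_offsets.shape)
    return targets + ddv_offsets + ccv

# Example: map one quantized weight onto N_CELLS cells and write it twice.
ddv = make_array(n_weights=1)
target_levels = np.array([[2.0, 1.0, 3.0]])  # per-cell target levels
print(program(target_levels, ddv))           # cycle 1
print(program(target_levels, ddv))           # cycle 2: same DDV, new CCV
```

Running the example twice with the same DDV offsets shows why training can average out CCV (it differs between writes) but not DDV (it biases the same cells in the same direction every cycle).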
