Abstract

Neuromorphic computation-in-memory fabrics based on emerging non-volatile memories (NVMs) are considered an attractive option for accelerating neural networks (NNs) in hardware, as they provide high performance, low power consumption, and reduced data movement. Although NVMs offer many benefits, they are susceptible to data retention faults, in which previously stored data is lost after a certain amount of time due to external influences. These faults tend to occur uni-directionally and severely impact the inference accuracy of hardware NN implementations, since the synaptic weights stored in the NVMs are subject to retention faults. In this work, we propose an approximate scrubbing technique for NVM-based neuromorphic fabrics that mitigates uni-directional retention faults with virtually zero storage overhead, depending on the definition of the scrub area, for multilayer perceptrons (MLPs) and convolutional neural networks (CNNs). The training of the NNs is adjusted accordingly to meet the requirements of the proposed approximate scrubbing scheme. On different benchmarks, the proposed scrubbing approach improves the inference accuracy by up to 85.51% over the device operational time, with negligible storage overhead.
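To make the core idea concrete, below is a minimal sketch of how uni-directional retention faults and approximate scrubbing could interact. It is not the paper's implementation: the fault direction (drift toward 0), the functions `inject_retention_faults` and `approximate_scrub`, and the `scrub_level` parameter are all illustrative assumptions. The sketch assumes training has constrained the weights of a scrub area to lie at or above a shared reference level, so a faulty cell can be detected and rewritten without storing an exact golden copy of the weights.

```python
import numpy as np

# Hypothetical illustration (not the authors' implementation): NVM cells suffer
# uni-directional retention faults, modeled here as affected cells collapsing
# toward 0. Approximate scrubbing rewrites any cell that has drifted below a
# single shared reference level back to that level, so only the level itself
# (near-zero storage overhead) must be kept, not the original weight values.

rng = np.random.default_rng(0)

def inject_retention_faults(weights, fault_rate):
    """Uni-directional fault model: a random subset of cells drifts to 0."""
    faulty = rng.random(weights.shape) < fault_rate
    return np.where(faulty, 0.0, weights)

def approximate_scrub(weights, scrub_level):
    """Restore every cell below the reference level back to that level.

    Assumes training constrained all weights in this scrub area to lie in
    [scrub_level, 1], so any cell below the level must be faulty.
    """
    return np.where(weights < scrub_level, scrub_level, weights)

# Toy usage: weights trained into [0.5, 1.0] with an assumed scrub_level of 0.5.
w = rng.uniform(0.5, 1.0, size=(4, 4))
w_faulty = inject_retention_faults(w, fault_rate=0.1)
w_scrubbed = approximate_scrub(w_faulty, scrub_level=0.5)
print(np.abs(w - w_scrubbed).max())  # error bounded by the scrub granularity
```

The approximation is visible in the last line: a scrubbed cell recovers the reference level rather than its exact trained value, trading a bounded weight error for the elimination of per-weight backup storage.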
