Abstract

Neuromorphic computation-in-memory fabric based on emerging non-volatile memories (NVMs) is considered an attractive option for accelerating neural networks (NNs) in hardware, as it provides high performance, low power consumption, and reduced data movement. Although NVMs offer many benefits, they are susceptible to data retention faults, in which previously stored data is not retained; this severely degrades the inference accuracy of mapped NNs. Traditionally, memory scrubbing with error-correcting codes (ECC) is employed to mitigate retention faults in conventional CMOS memories. This is not feasible in NVM-based neuromorphic fabric due to the high overhead and the inability to perform encoding and decoding in analog computing. In this work, we propose an approximate scrubbing technique for NVM-based neuromorphic fabric that mitigates uni-directional retention faults with minimal storage overhead. The training of the NNs is adjusted accordingly to meet the requirements of the scrubbing scheme. Across different benchmarks, the proposed scrubbing approach improves inference accuracy by up to 85.51% over the lifetime with virtually zero storage overhead.
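As a rough illustration of the idea (a sketch, not the paper's exact scheme), the snippet below assumes that retention faults only drift cell conductances downward and that weights are programmed to a small set of known quantization levels; the level set and function name are hypothetical. Under these assumptions, scrubbing amounts to rewriting each cell to the nearest level at or above its read-back value, so no per-cell ECC metadata is needed:

```python
import numpy as np

# Assumed quantization levels the weights were programmed to. Because the
# retention faults are uni-directional (conductance only drifts downward),
# a drifted cell can be restored by snapping its read-back value UP to the
# nearest known level -- without storing any per-cell correction data.
LEVELS = np.array([0.0, 0.25, 0.5, 0.75, 1.0])

def approximate_scrub(read_back: np.ndarray) -> np.ndarray:
    """Rewrite each cell to the smallest known level >= its drifted value."""
    idx = np.searchsorted(LEVELS, read_back, side="left")
    idx = np.clip(idx, 0, len(LEVELS) - 1)
    return LEVELS[idx]

# Example: a weight programmed at 0.75 that drifted to 0.62 is rewritten
# to 0.75; a cell still sitting exactly on a level is left unchanged.
drifted = np.array([0.62, 0.25, 0.97, 0.40])
print(approximate_scrub(drifted))  # -> [0.75 0.25 1.   0.5 ]
```

Note that a cell drifting below the next lower level (e.g., from 0.75 down to 0.48) would be restored to the wrong level; this is what makes the scrubbing approximate, and plausibly why, per the abstract, the NN training is adjusted to tolerate such residual errors.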
