Abstract
Neuromorphic fabrics based on emerging resistive non-volatile memories (NVMs) are a promising approach for realizing Neural Networks (NNs), owing to the low power consumption and latency of in-memory computation. However, NVM non-idealities shift activation distributions away from their training-time values, degrading post-mapping inference accuracy. Re-calibrating a NN's batch normalization layers on a per-chip basis restores the shifted distributions close to the training distribution; however, the overhead of this re-calibration has not been addressed in existing studies. We therefore propose approximate batch normalization and a test pattern generation method for efficient re-calibration. The proposed method requires only 0.2% of the training data for re-calibration, yet regains 72.3% inference accuracy across various benchmark datasets and fault scenarios.
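The core idea of per-chip batch normalization re-calibration can be illustrated with a minimal NumPy sketch. The helper names and the simulated drift below are assumptions for illustration only; the abstract does not specify the approximate-BN formulation or the test-pattern construction, so this sketch simply shows how re-estimating BN statistics from a small calibration set re-centers activations that NVM non-idealities have shifted.

```python
import numpy as np

def recalibrate_bn(activations, eps=1e-5):
    # Hypothetical helper: re-estimate BN running statistics from a
    # small calibration set of post-mapping activations.
    return activations.mean(axis=0), activations.var(axis=0)

def bn_forward(x, mean, var, gamma, beta, eps=1e-5):
    # Standard batch normalization transform with affine parameters.
    return gamma * (x - mean) / np.sqrt(var + eps) + beta

# Simulate NVM non-idealities shifting activations away from the
# training-time distribution (scale and offset drift, illustrative values).
rng = np.random.default_rng(0)
train_acts = rng.normal(0.0, 1.0, size=(1000, 8))
shifted_acts = train_acts * 1.5 + 0.4

gamma, beta = np.ones(8), np.zeros(8)

# Re-calibrate using only a small fraction of the data (20 of 1000
# samples here, mimicking the paper's small-calibration-set setting).
calib = shifted_acts[:20]
mean, var = recalibrate_bn(calib)
out = bn_forward(shifted_acts, mean, var, gamma, beta)
# After re-calibration, the normalized activations are again close to
# zero mean and unit variance, as the downstream layers expect.
```

In a real deployment the calibration activations would come from running the paper's generated test patterns through the mapped chip rather than from a synthetic drift model.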