Abstract

Spiking Neural Networks (SNNs) can execute deep-learning-compatible tasks and approximation algorithms with low latency and low power consumption when run on a neuromorphic system. Adopting Analog In-Memory Computing (AiMC) in a neuromorphic system yields an advantage in memory density over a purely digital implementation. However, sensing the AiMC output with simple circuitry inevitably introduces unintended nonlinearities. In this study, we design an ultra-low-power neuromorphic circuit using memcapacitive AiMC synapses. We combine circuit-nonlinearity-aware training (CNAT) with network compression techniques to prevent the SNN from losing accuracy due to the neuron circuit's nonlinearity and the synapses' low resolution. The training runs on a standard machine learning framework and does not require computationally intensive SPICE simulations. In simulation, our circuit performs MNIST classification with almost no loss from the ideal accuracy (97.64 %) and consumes 15.7 nJ per inference.
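To illustrate the CNAT idea of training against a differentiable model of the circuit rather than SPICE, below is a minimal PyTorch sketch. The polynomial nonlinearity coefficients, the 4-bit synapse resolution, and the layer shapes are illustrative placeholders, not values or code from the paper; in practice the nonlinearity model would be fitted to circuit characterization data.

```python
import torch
import torch.nn as nn

class QuantizedWeight(torch.autograd.Function):
    """Quantize weights to a low-resolution grid (low-bit synapses);
    pass gradients through unchanged (straight-through estimator)."""
    @staticmethod
    def forward(ctx, w, levels):
        scale = w.abs().max().clamp(min=1e-8)
        return torch.round(w / scale * levels) / levels * scale

    @staticmethod
    def backward(ctx, grad_out):
        return grad_out, None

class CNATLinear(nn.Module):
    """Synapse layer whose output passes through a differentiable model
    of the sensing circuit's nonlinearity, so training sees the same
    transfer curve as the hardware (hypothetical polynomial model)."""
    def __init__(self, in_f, out_f, bits=4, poly=(0.0, 1.0, -0.05)):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_f, in_f) * 0.1)
        self.levels = 2 ** (bits - 1) - 1  # signed low-bit weight grid
        self.poly = poly                   # placeholder coefficients

    def forward(self, x):
        w_q = QuantizedWeight.apply(self.weight, self.levels)
        y = x @ w_q.t()
        # Distort the ideal MAC output: y -> c0 + c1*y + c2*y^2 + ...
        out = torch.zeros_like(y)
        for i, c in enumerate(self.poly):
            out = out + c * y ** i
        return out
```

Because the nonlinearity and quantization live inside the forward pass, the network's remaining weights learn to compensate for them during ordinary backpropagation, which is what lets training stay inside the machine learning framework.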
