Abstract

The spike generation mechanism and information coding process of biological neurons can be emulated by the amplitude-to-frequency modulation property of delta-sigma (ΔΣ) modulators. The oversampling, averaging, and noise-shaping features of ΔΣ modulation enable high neural coding accuracy and mitigate the intrinsic noise level in neural networks. In this paper, a ΔΣ modulator is proposed as a neuron activation function for inference and training of artificial analog neural networks. The inherent dithering of the ΔΣ modulator prevents the weights from getting stuck in spurious local minima, and its nonlinear transfer function makes it attractive for multi-layer architectures. Memristive synapses serve as the weights and are trained by supervised or unsupervised machine learning (ML) algorithms, using stochastic gradient descent (SGD) or biologically plausible spike-timing-dependent plasticity (STDP). Our ΔΣ networks outperform the prevalent power-hungry pulse-width modulation counterparts, achieving 97.37% training accuracy and a 3.2× speedup on MNIST with SGD. These findings constitute a milestone in closing the cultural gap between brain-inspired models and ML using analog neuromorphic hardware.
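
To make the amplitude-to-frequency modulation idea concrete, the following is a minimal, illustrative Python sketch of a first-order ΔΣ loop used as a spiking activation. It is not the authors' circuit or training pipeline: the function name, threshold, dither level, and step count are assumptions chosen for illustration. The integrator accumulates the error between the analog input and the fed-back one-bit output, so averaging the oversampled spike train recovers the input amplitude with noise-shaped quantization error.

```python
import numpy as np

def delta_sigma_neuron(x, n_steps=64, dither_std=1e-3, seed=None):
    """Encode an analog activation x in [0, 1] as a binary spike train.

    First-order delta-sigma loop: the mean firing rate of the output
    approximates x (amplitude-to-frequency modulation), and the
    quantization error is pushed to high frequencies (noise shaping),
    where averaging over the oversampled window removes it.
    """
    rng = np.random.default_rng(seed)
    integrator, y = 0.0, 0.0
    spikes = np.zeros(n_steps)
    for t in range(n_steps):
        # Dither randomizes the quantizer decision; the abstract credits
        # this inherent randomness with keeping weights out of spurious
        # local minima during training.
        integrator += x - y + dither_std * rng.standard_normal()
        y = 1.0 if integrator >= 0.5 else 0.0
        spikes[t] = y
    return spikes  # spikes.mean() ~ x for sufficiently many steps

# Example: an input of 0.3 yields a spike train firing ~30% of the time.
rate = delta_sigma_neuron(0.3, n_steps=256).mean()
print(f"decoded amplitude ≈ {rate:.3f}")
```

In this sketch the one-bit feedback plays the role of the neuron's spike, and decoding is a simple average; increasing the oversampling ratio (n_steps) trades latency for coding accuracy, which mirrors the accuracy/noise trade-off the abstract attributes to oversampling and averaging.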
