Abstract

Low-power and low-area neurons are essential for hardware implementation of large-scale SNNs. Various novel-physics-based leaky-integrate-and-fire (LIF) neuron architectures offering low power and area have been proposed, but they are not compatible with CMOS technology and therefore cannot enable brain-scale implementation of SNNs. In this paper, for the first time, we demonstrate a hardware implementation of a recurrent SNN using the proposed low-power, low-area, and low-leakage band-to-band-tunneling (BTBT) based neurons. A low-power thresholding circuit is also proposed. We further propose a predistortion technique that linearizes the nonlinear neuron without any area or power overhead. We establish the equivalence of the proposed neuron with the ideal LIF neuron to demonstrate its versatility. The tunneling regime gives the BTBT neuron a high input impedance (a few <inline-formula xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink"> <tex-math notation="LaTeX">$\text{G}\Omega$ </tex-math></inline-formula>), enabling voltage input without loading the synaptic array. To verify the proposed neuron, a 36-neuron recurrent SNN is fabricated in GF 45-nm PDSOI technology. We achieve 5000x lower energy-per-spike at a similar area, and 10x lower standby power at a similar area and energy-per-spike. This overall performance improvement paves the way for brain-scale computing.
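For reference, the ideal LIF dynamics against which the proposed BTBT neuron's equivalence is established take the standard textbook form (the symbols below are the conventional ones, not notation taken from this paper):

```latex
% Sub-threshold membrane dynamics of the ideal LIF neuron:
% the membrane potential V_mem leaks toward the rest potential V_rest
% with time constant \tau_m while integrating the input current I_in.
\begin{equation}
\tau_m \frac{dV_{\mathrm{mem}}}{dt}
  = -\bigl(V_{\mathrm{mem}} - V_{\mathrm{rest}}\bigr)
  + R_m I_{\mathrm{in}},
\end{equation}
% Spike-and-reset rule: when the membrane crosses the threshold V_th,
% a spike is emitted and the membrane is reset.
\begin{equation}
V_{\mathrm{mem}} \geq V_{\mathrm{th}}
  \;\Longrightarrow\;
  \text{spike, and } V_{\mathrm{mem}} \leftarrow V_{\mathrm{reset}}.
\end{equation}
```

The high input impedance of the tunneling regime matters here because it lets the synaptic array drive <inline-formula xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink"> <tex-math notation="LaTeX">$I_{\mathrm{in}}$ </tex-math></inline-formula> (or an equivalent voltage) into the neuron without being loaded by it.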
