Abstract

The development of brain-inspired spiking neural networks (SNNs) holds great potential for neuromorphic edge computing applications, but challenges remain in optimizing power efficiency and silicon utilization. Neurons, synapses and spike-based learning algorithms form the fundamental information processing mechanisms of SNNs. To achieve compact and biologically plausible SNNs while restricting power consumption, we propose a set of new neuromorphic building circuits, including an analog Leaky Integrate-and-Fire (LIF) neuron circuit, configurable synapse circuits and Spike Driven Synaptic Plasticity (SDSP) learning circuits. Specifically, we explore methods to minimize leakage currents and device-mismatch effects, and optimize the design of these neuromorphic circuits for low-power operation. Based on these building circuits, we propose a reconfigurable mixed-signal SNN that allows flexible configuration of synapse weights and attributes, enhancing SNN functionality and eliminating unnecessary power consumption. The SNN chip is fabricated in 55 nm CMOS technology, and test results show that the proposed circuits closely mimic the behaviors of LIF neurons, synapses and the SDSP mechanism. By configuring the synaptic arrays, we established varied connections between neurons in the SNN and demonstrated that the chip can implement Pavlov's dog associative learning and binary classification tasks, while dissipating energy per spike on the order of picojoules at firing rates ranging from 30 Hz to 1 kHz. The proposed circuits can serve as building blocks for constructing large-scale SNNs in neuromorphic processors.
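The LIF neuron behavior that the analog circuit mimics can be illustrated with a minimal discrete-time simulation. This sketch uses generic textbook LIF dynamics and arbitrary parameter values; it does not reflect the fabricated circuit's actual time constants or thresholds.

```python
# Minimal discrete-time Leaky Integrate-and-Fire (LIF) neuron sketch.
# Illustrative only: parameters are generic textbook values, not the
# characteristics of the 55 nm circuit described in the paper.

def simulate_lif(i_in, dt=1e-4, tau=20e-3, v_rest=0.0,
                 v_thresh=1.0, v_reset=0.0):
    """Return spike times (seconds) for an input trace i_in (arb. units)."""
    v = v_rest
    spikes = []
    for step, i in enumerate(i_in):
        # Leaky integration: the membrane potential decays toward v_rest
        # while being driven by the input current.
        v += dt * (-(v - v_rest) / tau + i)
        if v >= v_thresh:          # threshold crossing -> emit a spike
            spikes.append(step * dt)
            v = v_reset            # reset the membrane after firing
    return spikes

# A constant suprathreshold input produces regular firing,
# the hallmark of the LIF model.
spike_times = simulate_lif([60.0] * 2000)  # 200 ms at dt = 0.1 ms
```

Because the neuron resets to the same potential after every spike, a constant drive yields evenly spaced spikes; the inter-spike interval shrinks as the input current grows, which is the rate-coding property the analog circuit exploits.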
