Abstract

The promise of neuromorphic computing to develop ultra-low-power intelligent devices lies in its ability to localize information processing and memory storage in synaptic circuits, much like the synapses in the brain. Spiking neural networks modeled with high-resolution synapses and equipped with local unsupervised learning rules such as spike time-dependent plasticity (STDP) have shown promising results in tasks such as pattern detection and image classification. However, designing and implementing a conventional, multibit STDP circuit becomes complex both in terms of circuitry and the required silicon area. In this work, we introduce a modified, hardware-friendly STDP learning rule (named adaptive STDP) implemented using just 4-bit synapses. We demonstrate the capability of this learning rule in a pattern recognition task, in which a neuron learns to recognize a specific spike pattern embedded within noisy inhomogeneous Poisson spikes. Our results demonstrate that the performance of the proposed learning rule (94% using just 4-bit synapses) is comparable to that of conventional STDP learning (96% using 64-bit floating-point precision). The models used in this study are idealized models of a CMOS neuromorphic circuit with analog soma and synapse circuits and mixed-signal learning circuits. The learning circuit stores the synaptic weight in a 4-bit digital memory that is updated asynchronously. In circuit simulation with the Taiwan Semiconductor Manufacturing Company (TSMC) 250 nm CMOS process design kit (PDK), the static power consumption of a single synapse and the energy per spike (to generate a synaptic current of 15 pA amplitude and 3 ms time constant) are less than 2 pW and 200 fJ, respectively. The static power consumption of the learning circuit is less than 135 pW, and the energy to process a pair of pre- and postsynaptic spikes, corresponding to a single learning step, is less than 235 pJ. A single 4-bit synapse (configurable as excitatory, inhibitory, or shunting inhibitory), along with its learning circuitry and digital memory, occupies around 17,250 μm2 of silicon area.
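
The adaptive STDP rule and its mixed-signal circuit are detailed in the full text. As a rough illustration of how a pairwise STDP update can operate on a low-resolution synapse, the Python sketch below updates a weight held as a 4-bit integer using exponentially decaying spike traces. It is not the authors' adaptive STDP rule or circuit; the trace time constants, step sizes, and the function name stdp_step are illustrative assumptions.

    import numpy as np

    # Minimal sketch of a pairwise, trace-based STDP update with the weight
    # stored as a 4-bit integer (0..15). All parameter values are illustrative
    # placeholders, not values from the paper or its circuits.
    TAU_PLUS, TAU_MINUS = 20e-3, 20e-3   # trace time constants (s)
    A_PLUS, A_MINUS = 1.0, 1.0           # potentiation / depression steps (weight LSBs)
    W_MAX = 15                           # 4-bit weight range: 0..15
    DT = 1e-3                            # simulation time step (s)

    def stdp_step(w, pre_spike, post_spike, x_pre, x_post):
        """Apply one quantized STDP time step to a single synapse."""
        x_pre  *= np.exp(-DT / TAU_PLUS)    # decay presynaptic trace
        x_post *= np.exp(-DT / TAU_MINUS)   # decay postsynaptic trace
        if pre_spike:
            x_pre += 1.0
            # Pre after post: depress by an integer number of LSBs, floor at 0.
            w = max(0, w - int(round(A_MINUS * x_post)))
        if post_spike:
            x_post += 1.0
            # Post after pre: potentiate, clipped to the 4-bit maximum.
            w = min(W_MAX, w + int(round(A_PLUS * x_pre)))
        return w, x_pre, x_post

Clipping the integer weight to the range 0-15 mirrors the 4-bit digital memory described in the abstract; everything else (trace dynamics, step sizes) is a generic pairwise STDP formulation.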

Highlights

  • The primary goal of neuromorphic computing since its inception in the late 1980s has been to design low-power electronic circuits that can mimic human cognition as well as shed light on the complex mechanisms underlying neural computation

  • The core components of a neuromorphic system are neuronal soma and synaptic circuits empowered with learning mechanisms, supervised or unsupervised

  • Spike time-dependent plasticity (STDP) is the best-known rule for unsupervised learning in the brain and is implemented in many neuromorphic systems


Summary

Introduction

The primary goal of neuromorphic computing since its inception in the late 1980s has been to design low-power electronic circuits that can mimic human cognition as well as shed light on the complex mechanisms underlying neural computation. The common feature in all the approaches is the core components of a neuromorphic system: neuronal soma and synaptic circuits empowered with learning mechanisms, supervised or unsupervised. These circuits are designed on the basis of mathematical models of the cell membrane, its ionic dynamics, and synapses, which in turn are constructed from electrophysiological measurements. STDP-based unsupervised learning has been successful in tasks such as pattern detection (Masquelier et al., 2008, 2009) and image classification (Diehl and Cook, 2015), achieving high performance in simulation.
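
As an illustration of the pattern-detection setting referenced above, the sketch below generates the kind of stimulus used in such experiments: background Poisson spike trains across many afferents, with a fixed spike pattern re-inserted at random times. The afferent count, rates, and durations are placeholder assumptions, not the protocol of Masquelier et al. (2008) or of this paper.

    import numpy as np

    # Illustrative stimulus generator for an STDP pattern-detection experiment:
    # independent Poisson background spikes with a fixed spike pattern pasted in
    # at randomly chosen times. All numbers are placeholder assumptions.
    rng = np.random.default_rng(0)
    N_AFF, DT, T_TOTAL = 100, 1e-3, 10.0   # afferents, time step (s), total duration (s)
    RATE_HZ, PAT_LEN = 20.0, 0.05          # background firing rate (Hz), pattern length (s)

    steps = int(T_TOTAL / DT)
    pat_steps = int(PAT_LEN / DT)

    # Background: Bernoulli approximation of a Poisson process per afferent and step.
    spikes = rng.random((N_AFF, steps)) < RATE_HZ * DT

    # Draw one fixed pattern, then overwrite the background with it at random
    # start times (overlaps between pastes are possible in this simple sketch).
    pattern = rng.random((N_AFF, pat_steps)) < RATE_HZ * DT
    for start in rng.choice(steps - pat_steps, size=20, replace=False):
        spikes[:, start:start + pat_steps] = pattern

A detector neuron trained with STDP on such input gradually concentrates its weight on afferents and delays that coincide with the embedded pattern, which is the effect quantified by the recognition rates reported in the abstract.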

