Abstract

Hardware-based spiking neural networks (SNNs) that mimic biological neurons have been reported. However, conventional neuron circuits in SNNs occupy a large area and consume high power. In this work, a split-gate floating-body positive feedback (PF) device with charge-trapping capability is proposed as a new neuron device that imitates the integrate-and-fire function. Because of the PF characteristic, the subthreshold swing (SS) of the device is less than 0.04 mV/dec. This super-steep SS leads to a low energy consumption of ∼0.25 pJ/spike for a neuron circuit (PF neuron) built with the PF device, which is ∼100 times lower than that of a conventional neuron circuit. The charge-storage property of the device mimics the integrate function of biological neurons without a large membrane capacitor, reducing the PF neuron area to about 1/17 that of a conventional neuron. Through simulation, we demonstrate the successful operation of a dense multiple-PF-neuron system with reset and lateral inhibition using a common self-controller in a neuron layer. With the multiple-PF-neuron system and a synapse array, online unsupervised pattern learning and recognition are performed successfully, demonstrating the feasibility of the PF device in a neural network.
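
For readers unfamiliar with the integrate-and-fire behavior the PF neuron reproduces, the following is a minimal behavioral sketch in plain Python. It is not the authors' device or circuit model: the PF physics (positive feedback, charge trapping, sub-0.04 mV/dec swing) is abstracted into a simple accumulate-until-threshold rule, and every parameter value is an illustrative assumption.

    # Minimal behavioral sketch of an integrate-and-fire neuron (illustrative only;
    # the PF device physics is abstracted into an accumulate-until-threshold rule).

    def integrate_and_fire(input_current, threshold=1.0, leak=0.98, dt=1.0):
        """Return the time steps at which the neuron fires."""
        v = 0.0                        # state variable standing in for stored charge
        spikes = []
        for t, i_in in enumerate(input_current):
            v = leak * v + i_in * dt   # integrate: accumulate the weighted input
            if v >= threshold:         # steep switching: fire once the threshold is crossed
                spikes.append(t)
                v = 0.0                # reset, analogous to erasing the stored charge
        return spikes

    # Example: a constant drive makes the neuron fire periodically.
    print(integrate_and_fire([0.3] * 20))   # -> [3, 7, 11, 15, 19]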

Highlights

  • In the mixed-mode simulations, the positive feedback (PF) device was simulated with a device simulator, while the complementary metal-oxide-semiconductor (CMOS) FETs were simulated with a circuit simulator

  • The multiple-neuron system, comprising PF neuron circuits that combine several CMOS FETs with an equivalent circuit reflecting the electrical behavior of the fabricated PF device, was simulated with a circuit simulator (HSPICE, Synopsys) using Predictive Technology Models (PTMs); a simplified behavioral sketch of what such a simulation computes follows this list
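
As a rough illustration of what the multiple-neuron simulation computes at the behavioral level, the sketch below models a small neuron layer with reset and lateral inhibition in Python. It is not the HSPICE/PTM mixed-mode setup itself: the common self-controller is reduced to a winner-take-all rule that resets the whole layer when any neuron fires, and all thresholds, weights, and inputs are illustrative assumptions rather than the paper's values.

    # Behavioral sketch of a small neuron layer with reset and lateral inhibition.
    # The "common self-controller" is modeled abstractly: when any neuron fires,
    # the whole layer is reset (winner-take-all style inhibition).

    import random

    def simulate_layer(inputs, weights, threshold=1.0, leak=0.95):
        """inputs: per-time-step lists of input amplitudes.
        weights[j][i] connects input i to neuron j."""
        n_neurons = len(weights)
        v = [0.0] * n_neurons            # stored-charge state of each neuron
        spike_record = []
        for x in inputs:
            winner = None
            for j in range(n_neurons):
                drive = sum(w * xi for w, xi in zip(weights[j], x))
                v[j] = leak * v[j] + drive
                if v[j] >= threshold and winner is None:
                    winner = j           # first neuron to cross threshold fires
            if winner is not None:
                v = [0.0] * n_neurons    # common controller resets the whole layer
            spike_record.append(winner)  # None means no spike this time step
        return spike_record

    # Example: two neurons, two inputs; each neuron prefers one input.
    random.seed(0)
    weights = [[0.6, 0.1], [0.1, 0.6]]
    inputs = [[1.0, 0.0] if random.random() < 0.5 else [0.0, 1.0] for _ in range(10)]
    print(simulate_layer(inputs, weights))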

Introduction

Conventional computing systems based on the von Neumann architecture suffer from an energy-efficiency problem compared to biological brains when processing complex data and information (Cantley et al, 2011; Indiveri et al, 2011; Yu et al, 2011; Park et al, 2013). As an alternative to conventional computing architectures, neuromorphic computing architectures have been studied to enable complex processes such as pattern recognition, classification, and perception (Wijekoon and Dudek, 2008; Ghosh-Dastidar and Adeli, 2009; Basu et al, 2013; Rajendran et al, 2013; Kasabov, 2014; Eryilmaz et al, 2015). Among these architectures, deep neural networks (DNNs) such as the deep-belief network (DBN) and the convolutional network (ConvNet), which are computing architectures based on mathematical algorithm models, have been widely reported to reduce computing energy by mimicking the parallel computation of biological brains. The memristor is compatible with the CMOS process, but it has reliability issues when manufactured at the nanoscale and constructed as multiple array layers (Pouyan et al, 2014).
