Abstract

Hardware-based spiking neural networks (SNNs) are regarded as promising candidates for cognitive computing systems because of their low power consumption and highly parallel operation. In this work, we train an SNN in which information is carried by spike firing times, using temporal backpropagation. The temporally encoded SNN with 512 hidden neurons achieves an accuracy of 96.90% on the MNIST test set. Furthermore, the effect of device variation on the accuracy of the temporally encoded SNN is investigated and compared with that of a rate-encoded network. In the hardware configuration of our SNN, a NOR-type analog memory with an asymmetric floating gate is used as the synaptic device. In addition, we propose a neuron circuit including a refractory-period generator for the temporally encoded SNN. The performance of a 2-layer neural network consisting of these synapses and the proposed neurons is evaluated through circuit simulation using SPICE. The network with 128 hidden neurons shows an accuracy of 94.9% on the MNIST dataset, a 0.1% reduction compared with the system-level simulation. Finally, the latency and power consumption of each block constituting the temporal network are analyzed and compared with those of the rate-encoded network as a function of the total number of time steps. Assuming that the network uses 256 time steps in total, the temporal network consumes 15.12 times less power than the rate-encoded network and makes decisions 5.68 times faster.
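The 15.12× and 5.68× figures come from the paper's circuit-level analysis, which is not reproduced on this page. Purely as an illustrative sketch of the underlying mechanism, assume rate coding emits up to one spike per neuron per time step and needs the full window before a decision, while time-to-first-spike (TTFS) coding emits at most one spike per neuron and allows a decision at the earliest output spike; the values and mapping below are hypothetical.

```python
import numpy as np

T = 256                          # total time steps, as assumed in the abstract
rng = np.random.default_rng(0)
x = rng.random(100)              # 100 hypothetical neuron activations in [0, 1)

# Rate coding: activation -> spike count over the window; the decision
# is taken only after all T steps have elapsed.
rate_spikes  = np.round(x * T).sum()
rate_latency = T

# TTFS coding: activation -> firing time of a single spike (a larger value
# fires earlier); the decision can be taken at the earliest output spike.
ttfs_times   = np.round((1.0 - x) * (T - 1)).astype(int)
ttfs_spikes  = ttfs_times.size   # at most one spike per neuron
ttfs_latency = int(ttfs_times.min()) + 1

print(f"spikes  : rate {rate_spikes:.0f}  vs  TTFS {ttfs_spikes}")
print(f"latency : rate {rate_latency} steps  vs  TTFS {ttfs_latency} steps")
```

The actual power and latency ratios depend on the synapse and neuron circuit blocks and cannot be recovered from a spike count like this; the sketch only shows why fewer spikes and earlier decisions are possible with temporal coding.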

Highlights

  • In this study, we evaluated the performance of an SNN consisting of NOR-type asymmetric floating-gate (FG) synaptic devices and neuron circuits at both the system level and the circuit level.

  • Input data were encoded as the firing times of the input spikes, and the network was trained by temporal backpropagation, a learning method suited to networks that use time-to-first-spike (TTFS) encoding (an illustrative sketch follows below).
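The paper's exact neuron model and weight-update rule are not reproduced on this page. Purely to illustrate how temporal backpropagation can work, the sketch below uses a common temporal-coding formulation, a non-leaky integrate-and-fire neuron with exponentially decaying synaptic current kernels, for which the output spike time has a closed form and an analytic weight gradient; the function and variable names are hypothetical.

```python
import numpy as np

def ttfs_neuron_forward(z_in, w):
    """Output spike time of a non-leaky integrate-and-fire neuron with
    exponentially decaying synaptic kernels, expressed in the z = exp(t) domain.

    With V(t) = sum_i w_i * (1 - exp(t_i - t)) for t > t_i, solving V(t_out) = 1
    gives z_out = sum_C w_i z_i / (sum_C w_i - 1), where C is the causal set of
    input spikes arriving before the output spike."""
    order = np.argsort(z_in)                  # process earlier input spikes first
    zs, ws = z_in[order], w[order]
    for k in range(1, len(zs) + 1):           # grow the causal set one spike at a time
        denom = ws[:k].sum() - 1.0
        if denom <= 0.0:
            continue                          # threshold cannot be reached with these inputs
        z_out = (ws[:k] * zs[:k]).sum() / denom
        nxt = zs[k] if k < len(zs) else np.inf
        if zs[k - 1] <= z_out < nxt:          # t_out falls after the k-th input, before the next
            grad_w = np.zeros_like(w)
            grad_w[order[:k]] = (zs[:k] - z_out) / denom   # d z_out / d w_i for the causal set
            return z_out, grad_w
    return np.inf, np.zeros_like(w)           # neuron never fires

# Example: two input spikes at t = 0 and t = 0.5, weights 0.9 and 0.8.
z_out, grad = ttfs_neuron_forward(np.exp(np.array([0.0, 0.5])), np.array([0.9, 0.8]))
print(np.log(z_out), grad)                    # output firing time and weight gradient
```

A training step would chain these spike-time gradients backward through the layers, in the same way ordinary backpropagation chains activation gradients.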

Summary

INTRODUCTION

Artificial Neural Networks (ANNs) have recently shown remarkable results, surpassing humans in certain tasks such as pattern recognition, object detection, and natural language processing [1,2,3,4,5,6,7]. However, software-based ANNs are far from real-time, low-power processing, which makes computing on edge devices challenging. From this perspective, there have been many studies on hardware-based neural networks [11,12]; in particular, SNNs using analog synaptic devices are regarded as highly competitive. Besides rate encoding, another candidate encoding method is temporal encoding, in which the input data are transformed into the firing times of the input spikes [18]. We configure an SNN at the circuit level that adopts the temporal encoding method, so that information is carried by the firing time of a single spike.
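As a concrete illustration of the temporal (TTFS) encoding step, the minimal Python sketch below maps each MNIST pixel intensity to the firing time of a single input spike, with brighter pixels firing earlier. The linear mapping, the 256-step window, and the convention that zero-intensity pixels never spike are assumptions for illustration, not necessarily the paper's exact scheme.

```python
import numpy as np

def ttfs_encode(image, t_max=256):
    """Time-to-first-spike encoding: map each pixel intensity to the firing
    time of one input spike; a brighter pixel fires earlier."""
    x = image.astype(float).ravel() / 255.0              # normalise 8-bit pixels to [0, 1]
    spike_time = np.round((1.0 - x) * (t_max - 1)).astype(int)
    spike_time[x == 0.0] = -1                            # convention: zero pixels never spike
    return spike_time

# Example: a zero pixel never spikes, a saturated pixel spikes at the first step.
demo = np.array([0, 64, 128, 255], dtype=np.uint8)
print(ttfs_encode(demo))                                 # -> [ -1 191 127   0]
```

Each image thus produces at most one spike per input line, which is what allows the downstream network to reach a decision well before the end of the time window.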

Methods
NOR-TYPE SYNAPTIC DEVICE HAVING ASYMMETRIC FLOATING GATE
PERFORMANCE OF SNN ON MNIST
EFFECTS ON ACCURACY BY VARIATION IN HARDWARE
CIRCUIT-LEVEL SIMULATIONS AND RESULTS
Findings
CONCLUSION