Abstract

Deep Artificial Neural Networks (ANNs) employ a simplified analog neuron model that mimics the rate transfer function of integrate-and-fire neurons. In Spiking Neural Networks (SNNs), the predominant information transmission method is based on rate codes. This code is inefficient from a hardware perspective because the number of transmitted spikes is proportional to the encoded analog value. Alternative codes such as temporal codes that are based on single spikes are difficult to scale up for large networks due to their sensitivity to spike timing noise. Here we present a study of an encoding scheme based on temporal spike patterns. This scheme inherits the efficiency of temporal codes but retains the robustness of rate codes. The pattern code is evaluated on MNIST, CIFAR-10, and ImageNet image classification tasks. We compare the network performance of ANNs, rate-coded SNNs, and temporal-coded SNNs, using the classification error and operation count as performance metrics. We also estimate the power consumption of the digital logic needed for the operations associated with each encoding type, and the impact of the bit precision of the weights and activations. On ImageNet, the temporal pattern code achieves up to a $35\times$ reduction in the estimated power consumption compared to the rate-coded SNN, and $42\times$ compared to the ANN. The classification error of the pattern-coded SNN is increased by $<1\%$ compared to the ANN, and decreased by 2% compared to the rate-coded SNN.
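For intuition only (not part of the paper's text), the sketch below contrasts the two encoding ideas summarized above: a rate code, whose spike count grows in proportion to the analog value, and a hypothetical temporal pattern code, in which the value is carried by which time slots contain spikes, so the spike count stays small and bounded. The function names, the 64-step window, and the 4-bit slot layout are illustrative assumptions, not the authors' exact scheme.

```python
import numpy as np

def rate_encode(value, n_steps=64):
    """Rate code: the expected spike count over the window is proportional
    to the analog value, so larger activations cost more spikes (and energy)."""
    prob = np.clip(value, 0.0, 1.0)
    return (np.random.rand(n_steps) < prob).astype(np.uint8)

def pattern_encode(value, n_steps=64, n_bits=4):
    """Illustrative temporal pattern code (an assumption for intuition, not the
    paper's method): quantize the value to n_bits and emit at most one spike per
    bit slot, so the spike count is bounded by n_bits rather than scaling with
    the encoded magnitude."""
    level = int(np.clip(value, 0.0, 1.0) * (2 ** n_bits - 1))  # n_bits quantization
    spikes = np.zeros(n_steps, dtype=np.uint8)
    slot = n_steps // n_bits
    for i in range(n_bits):
        if (level >> i) & 1:          # spike in slot i iff bit i of the value is set
            spikes[i * slot] = 1
    return spikes

x = 0.8
print(rate_encode(x).sum())     # roughly 0.8 * 64 spikes, grows with x
print(pattern_encode(x).sum())  # at most 4 spikes, independent of the magnitude of x
```

Under these assumptions, the spike (and hence operation) count of the pattern code is bounded by the bit precision of the activation, which is the source of the efficiency gap quantified in the abstract.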
