Abstract

Spiking neural network (SNN)-based neuromorphic hardware has been extensively studied for dynamic information processing. However, training algorithms that drive or support compact spatio-temporal neuromorphic hardware are still lacking, and existing neuromorphic hardware uses excessive on-chip memory to store parameters, which limits its neuron and synapse scale. Here, we introduce a hardware-friendly weight-binarized spiking neural network (BSNN) to efficiently recognize spatio-temporal event-based data. BSNN adopts the spike response model (SRM) neuron for its rich spatio-temporal characteristics. During training, a surrogate gradient replaces the non-differentiable derivative of the spike train, and the weights are binarized. Moreover, because SNN activity is inherently binary (the inputs, outputs, and inter-neuron communications are all binary spikes), the hardware-expensive matrix–vector multiplication (MVM) can be replaced during inference with a hardware-friendly "Signed AND" operation, which favors the construction of compact neuromorphic hardware. The trained BSNN achieves competitive recognition accuracies of 99.52% on N-MNIST and 62.1% on DVS-CIFAR10 (dynamic image datasets), 97.57% on the dynamic gesture dataset DvsGesture, and 90.35% on the dynamic audio dataset N-TIDIGITS18, all of which relate to human vision or hearing. The proposed compact SNN training method paves the way toward hardware-saving, power-efficient neuromorphic hardware for real-time dynamic information processing.
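The intuition behind replacing MVM with a "Signed AND" can be illustrated with a minimal sketch. With binary spikes s ∈ {0, 1} and binarized weights w ∈ {+1, −1}, each synaptic contribution is just the weight's sign gated by the spike bit, so a dot product reduces to two bitwise-AND-plus-popcount operations and one subtraction. The bit-packing scheme and function name below are illustrative assumptions, not the paper's actual hardware implementation.

```python
def signed_and_dot(spikes: int, w_sign: int, n: int) -> int:
    """Dot product of an n-bit binary spike vector with {+1, -1} weights,
    computed with bitwise AND and popcount instead of multiply-accumulate.

    spikes: n-bit mask, bit i = 1 iff presynaptic neuron i fired.
    w_sign: n-bit mask, bit i = 1 iff weight i is +1 (0 means -1).
    """
    mask = (1 << n) - 1
    active = spikes & mask
    pos = bin(active & w_sign).count("1")           # spikes hitting +1 weights
    neg = bin(active & ~w_sign & mask).count("1")   # spikes hitting -1 weights
    return pos - neg


def naive_dot(spike_bits, weights):
    """Reference MVM-style accumulation for a single output neuron."""
    return sum(s * w for s, w in zip(spike_bits, weights))
```

Because the spike vector is already binary, no multiplier is needed at inference time; the equivalence of the two routines is what makes binarized weights attractive for compact neuromorphic circuits.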
