Abstract

Spiking neural networks (SNNs) are promising, but their development has fallen far behind that of conventional deep neural networks (DNNs) because they are difficult to train. To resolve the training problem, we analyze the closed-form input-output response of spiking neurons and use the response expression to build abstract SNN models for training. This avoids calculating the membrane potential during training and makes direct training of SNNs as efficient as that of DNNs. We show that the non-leaky integrate-and-fire neuron with single-spike temporal coding is the best choice for direct-train deep SNNs. We develop an energy-efficient phase-domain signal processing circuit for the neuron and propose a direct-train deep SNN framework. Thanks to the ease of training, we train deep SNNs under weight quantization to study their robustness on low-cost neuromorphic hardware. Experiments show that our direct-train deep SNNs achieve the highest CIFAR-10 classification accuracy among SNNs, reach ImageNet classification accuracy within 1% of a DNN of equivalent architecture, and are robust to weight quantization and noise perturbation.
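To illustrate the kind of closed-form input-output response the abstract refers to, below is a minimal sketch (not the paper's exact model) assuming a non-leaky integrate-and-fire neuron driven by step synaptic currents with single-spike temporal coding: each input fires once, the membrane potential grows piecewise linearly, and the output spike time can be written explicitly in terms of input spike times and weights, so no membrane-potential simulation is needed during training. The function name and parameters are illustrative.

```python
import numpy as np

def if_spike_time(in_times, weights, theta=1.0):
    """Closed-form output spike time of a non-leaky integrate-and-fire neuron
    with single-spike temporal coding (a sketch, not the paper's exact model).

    Assumes input i fires once at in_times[i] and then injects a constant
    current weights[i], so V(t) = sum_{t_i <= t} w_i * (t - t_i). The neuron
    fires when V first reaches theta, giving
        t_out = (theta + sum_{i in C} w_i * t_i) / (sum_{i in C} w_i),
    where C is the causal set of inputs arriving before t_out.
    """
    order = np.argsort(in_times)          # process inputs in arrival order
    t_sorted = np.asarray(in_times, dtype=float)[order]
    w_sorted = np.asarray(weights, dtype=float)[order]

    w_sum, wt_sum = 0.0, 0.0
    for k in range(len(t_sorted)):
        w_sum += w_sorted[k]
        wt_sum += w_sorted[k] * t_sorted[k]
        if w_sum <= 0:
            continue                      # not enough drive to reach threshold yet
        t_out = (theta + wt_sum) / w_sum
        # The candidate is valid only if the neuron fires before the next input arrives.
        next_t = t_sorted[k + 1] if k + 1 < len(t_sorted) else np.inf
        if t_sorted[k] <= t_out <= next_t:
            return t_out
    return np.inf                         # neuron never fires

# Example: three inputs. The returned spike time is an explicit, differentiable
# function of the weights, so a network of such expressions can be trained
# with standard gradient-based DNN tooling.
print(if_spike_time([0.0, 0.2, 0.5], [0.8, 0.6, -0.3], theta=1.0))
```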
