Abstract

Spiking Neural Networks (SNNs) are considered the third generation of artificial neural networks, and their information processing more closely resembles that of biological brains. However, training non-differentiable SNNs efficiently and robustly with spike-based signals remains a challenge. Here we present an alternative method to train SNNs, drawing on biologically plausible structural and functional inspirations from the brain. First, inspired by the brain's prominent top-down structural connections, a global random feedback alignment is designed to help the SNN propagate the error signal from the output layer directly to earlier layers. Second, inspired by local plasticity in biological systems, where synapses are tuned mainly by neighboring neurons, a differential STDP rule is used to optimize local plasticity. Extensive experimental results on the MNIST (98.62%) and Fashion MNIST (89.05%) benchmarks show that the proposed algorithm performs favorably against several state-of-the-art SNNs trained with backpropagation.
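
The abstract describes the global pathway only at a high level. Below is a minimal sketch of random feedback alignment as a way to assign output errors to hidden layers, assuming fixed random feedback matrices that project the output error directly to each layer; the layer sizes, scaling, and the function name `feedback_alignment_deltas` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

# Minimal sketch of global random feedback alignment (not the paper's exact
# formulation): each hidden layer receives the output-layer error through a
# fixed random matrix B_l instead of the transposed forward weights.

rng = np.random.default_rng(0)
layer_sizes = [784, 400, 400, 10]          # illustrative network shape

# One fixed random feedback matrix per hidden layer, mapping the output
# error (size 10) back to that layer's dimensionality.
B = [rng.standard_normal((n, layer_sizes[-1])) * 0.1 for n in layer_sizes[1:-1]]

def feedback_alignment_deltas(output_error):
    """Project the output-layer error directly to every hidden layer.

    output_error: (10,) vector, e.g. spike-rate output minus one-hot target.
    Returns one error/modulation signal per hidden layer.
    """
    return [b @ output_error for b in B]

# Example: a dummy error vector yields one modulation signal per hidden layer.
deltas = feedback_alignment_deltas(np.ones(10) * 0.1)
print([d.shape for d in deltas])           # [(400,), (400,)]
```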

Highlights

  • Deep neural networks (DNNs) have advanced the state of the art in many domain-specific tasks, such as image classification (He et al., 2016), visual object tracking (Danelljan et al., 2015), and visual object segmentation (Chen et al., 2017)

  • Spike-Timing-Dependent Plasticity (STDP) is widely regarded as a leading learning rule in the brain; it models the expected change in synaptic weight as a function of the relative timing of pre- and post-synaptic spikes (Bi and Poo, 1998), and it can be regarded as a local learning rule (see the sketch after this list)

  • We propose a Spiking Neural Network (SNN) training method that takes full advantage of both global and local plasticity information

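As a concrete illustration of the local rule referenced above, here is a minimal sketch of a classic pair-based STDP update with exponential timing windows (Bi and Poo, 1998). The amplitudes and time constants are illustrative values, and this is not the paper's exact differential STDP formulation.

```python
import numpy as np

# Minimal sketch of a pair-based STDP update with the classic exponential
# window (Bi and Poo, 1998); amplitudes and time constants are illustrative.

A_PLUS, A_MINUS = 0.01, 0.012
TAU_PLUS, TAU_MINUS = 20.0, 20.0          # time constants in ms

def stdp_dw(t_pre, t_post):
    """Weight change for a single pre/post spike pair.

    Potentiation when the pre-synaptic spike precedes the post-synaptic one
    (t_post > t_pre), depression otherwise.
    """
    dt = t_post - t_pre
    if dt > 0:
        return A_PLUS * np.exp(-dt / TAU_PLUS)
    return -A_MINUS * np.exp(dt / TAU_MINUS)

print(stdp_dw(10.0, 15.0))   # pre before post -> positive (potentiation)
print(stdp_dw(15.0, 10.0))   # post before pre -> negative (depression)
```
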

Summary

INTRODUCTION

Deep neural networks (DNNs) have advanced the state of the art in many domain-specific tasks, such as image classification (He et al., 2016), visual object tracking (Danelljan et al., 2015), and visual object segmentation (Chen et al., 2017). The structural connections (e.g., long-term feedback loops in the cortex) and functional plasticity (e.g., neighborhood plasticity based on discrete spikes) of the biological brain have been shaped by millions of years of evolution. This has motivated research into biologically plausible Spiking Neural Networks (SNNs). We propose an SNN training method that combines global feedback connections with a local differential STDP learning rule and performs favorably against several existing state-of-the-art methods, providing an alternative route to training deeper SNNs. Extensive experimental results on different datasets indicate that the proposed algorithm can significantly improve the learning ability of SNNs.
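
The METHODS outline below starts from a basic LIF neuron model. For orientation, here is a minimal discrete-time LIF sketch; the decay factor, firing threshold, and reset value are illustrative assumptions rather than the paper's parameters.

```python
# Minimal discrete-time leaky integrate-and-fire (LIF) neuron, the basic
# spiking unit named in the METHODS outline. The decay factor, threshold,
# and reset value below are illustrative, not the paper's parameters.

DECAY, V_THRESH, V_RESET = 0.9, 1.0, 0.0

def lif_step(v, input_current):
    """One simulation step: leak, integrate the input, spike and reset."""
    v = DECAY * v + input_current
    if v >= V_THRESH:
        return V_RESET, 1.0   # emit a spike and reset the membrane potential
    return v, 0.0

# Drive a single neuron with a constant current and record its spike train.
v, spikes = 0.0, []
for _ in range(20):
    v, s = lif_step(v, 0.3)
    spikes.append(s)
print(spikes)
```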

BACKGROUND
Biologically Plausible Methods in ANNs
Spiking Neural Networks
METHODS
The Basic LIF Neuron Model
The Global Plasticity Learning Process of Our Model
The Local Learning Process of Our Model
The Whole Learning Framework
EXPERIMENTS
Fashion MNIST
Ablation Studies
Comparison With Other Traditional SNNs Trained With STDP
CONCLUSION AND FUTURE WORK
Findings
DATA AVAILABILITY STATEMENT
