Abstract

Forgetting is an essential phenomenon in the human brain that helps people filter out irrelevant information. This phenomenon can be emulated in electronic synaptic devices, which are the critical units for hardware implementation of Spiking Neural Networks (SNNs). To the best of our knowledge, however, the forgetting phenomenon has rarely been applied to the weight updates of traditional SNNs based on spike-timing-dependent plasticity (STDP). In this work, we propose a novel SNN training algorithm that exploits the forgetting phenomenon. The weight-update procedure consists of potentiation and forgetting, implemented by single-polarity pulses and the time intervals between training samples. Benchmarked on the MNIST handwritten-digit dataset, we demonstrate the algorithm's performance with a single-layer perceptron of 784 × 10 synapses. In addition, the influence of several non-ideal factors is taken into account to analyze the robustness of the enhanced SNN. Simulation results indicate that the proposed SNN with forgetting exhibits faster convergence and a higher recognition rate (88.07%) than a traditional SNN of similar network scale. Moreover, it shows good tolerance to non-linear conductance response and device variation, while placing lower endurance requirements on electronic synaptic devices.
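The weight-update mechanism described above (potentiation by single-polarity pulses, forgetting over the intervals between training samples) can be illustrated with a minimal sketch. The function below is a hypothetical illustration, not the authors' implementation: the Hebbian potentiation term, the exponential decay model of forgetting, and all parameter names (`lr`, `tau_forget`, `w_max`) are assumptions for demonstration only.

```python
import numpy as np

def update_weights(w, pre_spikes, post_spikes, dt,
                   lr=0.01, tau_forget=50.0, w_max=1.0):
    """Sketch of a weight update combining potentiation and forgetting.

    w           : (n_pre, n_post) synaptic weight matrix
    pre_spikes  : (n_pre,)  binary pre-synaptic spike vector
    post_spikes : (n_post,) binary post-synaptic spike vector
    dt          : time interval since the previous training sample
    """
    # Potentiation: strengthen synapses whose pre- and post-neurons both
    # fired; single-polarity pulses can only increase conductance, and the
    # (w_max - w) factor models a saturating conductance response.
    w = w + lr * np.outer(pre_spikes, post_spikes) * (w_max - w)

    # Forgetting: all weights decay passively during the interval between
    # training samples, emulating conductance relaxation in the device.
    w = w * np.exp(-dt / tau_forget)
    return np.clip(w, 0.0, w_max)
```

With forgetting playing the role of depression, no negative-polarity programming pulses are needed, which is consistent with the abstract's claim of reduced endurance requirements on the synaptic devices.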
