Abstract

We introduce a means of harnessing spiking neural networks (SNNs) with rich dynamics as a dynamic hypothesis to learn complex sequences. The proposed SNN is referred to as the nth order sequence-predicting SNN (n-SPSNN), which is capable of single-step prediction and sequence-to-sequence prediction, i.e., associative recall. As a key to these capabilities, we propose a new learning algorithm, named the learning by backpropagating action potential (LbAP) algorithm, which features (i) postsynaptic event-driven learning, (ii) access to topologically and temporally local data only, (iii) a competition-induced weight-normalization effect, and (iv) fast learning. Most importantly, the LbAP algorithm offers a unified learning framework over the entire SPSNN based on local data only. The learning capacity of the SPSNN is mainly dictated by the number of hidden neurons h; its prediction accuracy reaches its maximum value (~1) when h equals or exceeds twice the training sequence length l, i.e., h ≥ 2l. Another advantage is its high tolerance to errors in input encoding compared with the state-of-the-art sequence-learning networks, namely long short-term memory (LSTM) and gated recurrent unit (GRU). Additionally, its learning efficiency is approximately 100 times that of LSTM and GRU when measured in terms of the number of synaptic operations until successful training, which corresponds to multiply-accumulate operations for LSTM and GRU. This high efficiency arises from the higher learning rate of the SPSNN, which is attributed to the LbAP algorithm. The code is available online (https://github.com/galactico7/SPSNN).
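To make the four LbAP features above concrete, the following minimal Python sketch implements a hypothetical postsynaptic event-driven, local weight update with a competition-induced normalization effect. The function name, the trace-based potentiation/depression rule, and all constants are our assumptions for exposition, not the authors' exact rule; the actual implementation is in the linked repository.

```python
import numpy as np

# Illustrative sketch (assumed, simplified) of an LbAP-style update.
TAU_TRACE = 20.0   # ms, decay constant of the presynaptic spike trace (assumed)
ETA = 0.05         # learning rate (assumed)

def on_postsynaptic_spike(w, last_pre_spike_t, t_post):
    """Update the incoming weights of ONE neuron when it fires at t_post.

    Only topologically and temporally local data are used:
    - w: incoming weight vector of this neuron
    - last_pre_spike_t: most recent spike time of each presynaptic neuron
    - t_post: time of the triggering postsynaptic spike (the
      "backpropagating action potential" event)
    """
    # Presynaptic eligibility: inputs that spiked shortly before the
    # postsynaptic spike get a trace value in (0, 1]; acausal inputs get 0.
    dt = t_post - last_pre_spike_t
    trace = np.zeros_like(w)
    causal = dt >= 0
    trace[causal] = np.exp(-dt[causal] / TAU_TRACE)

    # Potentiate eligible synapses and subtract the mean update, so that
    # sum(w) is conserved before clipping: a competition-induced
    # weight-normalization effect.
    dw = ETA * trace
    dw -= dw.mean()
    return np.clip(w + dw, 0.0, None)  # keep weights non-negative

# Example: three synapses; the first spiked 2 ms before the postsynaptic event.
w = np.array([0.5, 0.5, 0.5])
pre_t = np.array([98.0, 60.0, 120.0])  # last presynaptic spike times (ms)
print(on_postsynaptic_spike(w, pre_t, t_post=100.0))
```

Note that the update runs only when the postsynaptic neuron fires (event-driven) and touches only that neuron's fan-in (locality), which is what makes such a rule cheap to evaluate in terms of synaptic operations.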

Highlights

  • A spiking neural network (SNN) is a dynamic hypothesis with diverse temporal kernels to express neuronal behaviors in response to synaptic transmission [1]–[3]

  • We propose an SNN architecture for temporal sequence learning, named nth order sequence-predicting spiking neural network (n-SPSNN)

  • To train the nth order sequence-predicting SNN (n-SPSNN), we propose an event-driven learning algorithm that uses local data only, referred to as the learning by backpropagating action potential (LbAP) algorithm


Summary

INTRODUCTION

A spiking neural network (SNN) is a dynamic hypothesis with diverse temporal kernels to express neuronal behaviors in response to synaptic transmission [1]–[3]. The effort to realize an SNN using integrated circuits, which has continued over the last three decades, paves the way for data- and energy-efficient acceleration of deep learning. However, there is a lack of both an SNN architecture for learning time-series data and a learning algorithm for such an architecture that performs sequence-prediction tasks, e.g., single-step prediction and sequence-to-sequence prediction (known as associative recall), with accuracy comparable to that of LSTM and GRU. In this regard, we propose an SNN architecture for temporal sequence learning, named the nth order sequence-predicting spiking neural network (n-SPSNN). Section IV.D addresses the learning efficiency of the n-SPSNN in terms of learning speed and the number of synaptic operations (SynOps) until the completion of learning.
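As a rough illustration of the efficiency metric used in Section IV.D, one can compare the spike-driven synaptic operations (SynOps) of an SNN with the multiply-accumulate (MAC) operations of an LSTM. The counting formulas below are a common first-order approximation under assumed sizes, not the paper's exact accounting.

```python
# First-order operation-count comparison (assumed, simplified accounting).
# In an SNN, a synaptic operation is charged only when a spike traverses a
# synapse; in an LSTM, every weight is multiplied at every time step.

def snn_synops(n_spikes_per_step, fan_out, steps):
    """SynOps ~ (spikes per step) x (synapses driven per spike) x steps."""
    return n_spikes_per_step * fan_out * steps

def lstm_macs(input_size, hidden_size, steps):
    """MACs per step for a standard LSTM: 4 gates, each with an input and a
    recurrent matrix-vector product (biases and nonlinearities ignored)."""
    return 4 * hidden_size * (input_size + hidden_size) * steps

# Example with made-up sizes: sparse spiking keeps the SynOp count low.
print(snn_synops(n_spikes_per_step=5, fan_out=100, steps=1000))  # 500,000
print(lstm_macs(input_size=64, hidden_size=128, steps=1000))     # 98,304,000
```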

RELATED WORK
SEQUENCE PREDICTION PRINCIPLE AND NETWORK ARCHITECTURE
RESULTS
CONCLUSION
TRAINING RNN WITH LSTM AND GRU LAYER
