Abstract

Due to the distributed and asynchronous nature of neural computation through low-energy spikes, brain-inspired hardware systems offer high energy efficiency and massive parallelism. One such platform is the IBM TrueNorth neurosynaptic system. Recently, TrueNorth-compatible representation learning algorithms have emerged, achieving close to state-of-the-art performance on various data sets. However, their application to temporal sequence processing models, such as recurrent neural networks (RNNs), is still only at the proof-of-concept level. There is an inherent difficulty in capturing the temporal dynamics of an RNN using spiking neurons, which is only exacerbated by the hardware constraints on connectivity and synaptic weight resolution. This paper presents a design flow that overcomes these difficulties and maps a special case of recurrent networks, the long short-term memory (LSTM), onto a spike-based platform. The framework combines approximation techniques such as activation discretization, weight quantization, and scaling and rounding; spiking neural circuits that implement the complex gating mechanisms; and a store-and-release technique that enables neuron synchronization and faithful storage. While the presented techniques can be applied to map LSTM onto any spiking neural network (SNN) simulator/emulator, here we choose the TrueNorth chip as the target platform and adhere to its hardware constraints. Three LSTM applications, parity check, extended Reber grammar, and question classification, are evaluated. The tradeoffs among accuracy, performance, and energy achieved on TrueNorth are demonstrated and compared with the performance on an SNN platform without hardware constraints, which represents the upper bound of the achievable accuracy.
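To make the approximation steps named in the abstract concrete, the sketch below illustrates one way weight quantization and activation discretization could look when preparing an LSTM for a low-precision spiking substrate. It is a minimal illustration only: the bit widths, level counts, symmetric scaling rule, and function names are assumptions for exposition and are not taken from the paper or from TrueseNorth's actual synapse format.

```python
import numpy as np

def quantize_weights(w, n_bits=4):
    """Uniformly quantize a weight matrix to signed n-bit integers.

    The bit width and symmetric scaling are illustrative assumptions;
    the paper's exact quantization rule and TrueNorth's synaptic weight
    resolution may differ.
    """
    q_max = 2 ** (n_bits - 1) - 1            # e.g. +7 for 4-bit signed
    scale = np.max(np.abs(w)) / q_max        # map the largest |w| to q_max
    w_int = np.round(w / scale).astype(np.int32)
    return w_int, scale

def discretize_activation(x, levels=16):
    """Discretize activations in [0, 1] onto a fixed number of levels,
    approximating a rate-coded spiking representation of a gate output."""
    x = np.clip(x, 0.0, 1.0)
    return np.round(x * (levels - 1)) / (levels - 1)

# Example: quantize a small gate weight matrix and discretize a sigmoid output.
rng = np.random.default_rng(0)
w_gate = rng.normal(scale=0.5, size=(8, 8))
w_q, s = quantize_weights(w_gate, n_bits=4)
gate = 1.0 / (1.0 + np.exp(-(w_gate @ rng.normal(size=8))))
print(w_q.min(), w_q.max(), discretize_activation(gate)[:4])
```

The general idea is that the full-precision LSTM is trained first and its weights and gate activations are then mapped onto the discrete grid the hardware supports, with scaling and rounding absorbing the mismatch between the trained values and the available resolution.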
