Abstract
To solve real-time challenges, neuromorphic systems generally require deep and complex network structures. Thus, it is crucial to search for effective solutions that reduce network complexity, improve energy efficiency, and maintain high accuracy. To this end, we propose unsupervised pruning strategies that prune neurons during training in spiking neural networks (SNNs) by utilizing network dynamics. Neuron importance is judged by spiking activity: neurons that fire more spikes contribute more to network performance. Based on this criterion, we demonstrate that pruning with an adaptive spike-count threshold provides a simple and effective approach that significantly reduces network size while maintaining high classification accuracy. Online adaptive pruning shows potential for energy-efficient training techniques because it requires fewer memory accesses and less weight-update computation. Furthermore, a parallel digital implementation scheme is proposed to implement SNNs on a field-programmable gate array (FPGA). Notably, our proposed pruning strategies preserve the dense format of the weight matrices, so the implementation architecture remains the same after network compression. The adaptive pruning strategy enables a 2.3× reduction in memory size and a 2.8× improvement in energy efficiency when 400 neurons are pruned from an 800-neuron network, at a cost of 1.69% in classification accuracy. The best pruning percentage depends on the trade-off among accuracy, memory, and energy. This work therefore offers a promising solution for effective network compression and energy-efficient hardware implementation of neuromorphic systems in real-time applications.
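The adaptive spike-count-threshold rule described above can be pictured with a short sketch. The snippet below is a minimal illustration rather than the paper's exact algorithm: the percentile-based threshold, the window length, and the Poisson spike activity standing in for the SNN simulation are all assumptions; only the general idea of pruning low-firing neurons during training while keeping the weight matrix dense comes from the text above.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_exc = 784, 800
W = rng.normal(0.0, 0.1, size=(n_in, n_exc))   # input-to-excitatory weights, kept dense
spike_counts = np.zeros(n_exc)                 # spikes accumulated over the current window

def accumulate(spikes):
    # Add one simulation step's spike counts (one entry per surviving neuron).
    spike_counts[:] += spikes

def prune(percentile=25.0):
    # Prune neurons whose spike count falls below an adaptive threshold.
    # The threshold here is a percentile of the current spike-count distribution;
    # the paper's exact adaptive rule is not reproduced.
    global W, spike_counts
    threshold = np.percentile(spike_counts, percentile)
    keep = spike_counts >= threshold
    W = W[:, keep]                              # whole columns removed: W stays dense
    spike_counts = np.zeros(int(keep.sum()))    # reset counts for the next window

# Stand-in for the SNN simulation: random spike activity over one pruning window.
for _ in range(200):
    accumulate(rng.poisson(0.4, size=W.shape[1]).astype(float))
prune()
print("surviving neurons:", W.shape[1])
```

Because whole columns are dropped rather than individual weights being zeroed, the surviving weights remain an ordinary dense matrix, which is what allows the hardware architecture to stay unchanged after compression.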
Highlights
The human brain is considered the most complex, energy-efficient, and intelligent control system: it supervises the functions of the body, interprets external information, takes appropriate actions and, most importantly, embodies the essence of our mind [1]. These facts lead researchers to embrace brain-inspired computing as a new paradigm for dealing with increasingly complex computational problems.
This paper proposes three different strategies for pruning neurons during training in an unsupervised spiking neural network, along with a parallel digital implementation scheme on a field-programmable gate array (FPGA).
Summary
Neuron pruning removes entire non-relevant neurons, which brings several benefits: (1) it significantly reduces the number of network parameters, since all synapses associated with a pruned neuron are pruned as well, providing speedup and energy reduction; (2) it eliminates entire rows/columns in the weight matrices, reducing their dimensions proportionally, which can be implemented in hardware more efficiently than unstructured weight pruning [9]; (3) it provides a way to determine the optimal number of neurons for a given network architecture [10]. We propose different strategies for pruning neurons during unsupervised training in SNNs, along with digital implementations, to demonstrate significantly reduced memory size and improved energy efficiency. No additional compression technique is required for the implementation of the pruned SNNs.
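Benefit (2), that removing a neuron strips whole rows and columns from the weight matrices while keeping them dense, can be seen in a few lines. The sketch below assumes an input layer fully connected to an excitatory layer with lateral connections among the excitatory neurons; the layer sizes, variable names, and choice of which neurons to prune are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

n_in, n_exc = 784, 800
W_in = rng.random((n_in, n_exc))     # input-to-excitatory weights
W_lat = rng.random((n_exc, n_exc))   # lateral (e.g., inhibitory) weights

def prune_neurons(W_in, W_lat, pruned):
    # Drop the pruned neurons' columns from W_in and their rows and columns from W_lat.
    keep = np.setdiff1d(np.arange(W_in.shape[1]), pruned)
    return W_in[:, keep], W_lat[np.ix_(keep, keep)]

# Prune 400 of the 800 neurons (indices chosen arbitrarily for illustration).
W_in_p, W_lat_p = prune_neurons(W_in, W_lat, pruned=np.arange(400))
print(W_in.size + W_lat.size, "parameters ->", W_in_p.size + W_lat_p.size)
```

In this toy setup, pruning 400 of 800 neurons removes well over half of the parameters outright, and the remaining matrices are still plain dense arrays, so no sparse-matrix bookkeeping is needed in hardware.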