Abstract

Most neural networks require the network architecture to be defined empirically in advance, which can cause over-fitting or under-fitting. Moreover, the large number of parameters in a fully connected network leads to prohibitively expensive computation and storage costs, making such models difficult to deploy on mobile devices. Dynamically optimizing the network architecture by pruning unused synapses is a promising technique for solving this problem. Most existing pruning methods focus on reducing the redundancy of deep convolutional neural networks by pruning unimportant filters or weights, often at the cost of an accuracy drop. In this paper, we propose an effective brain-inspired synaptic pruning method that dynamically modulates the network architecture while simultaneously improving network performance. The proposed model is biologically inspired in that it dynamically eliminates redundant connections based on the synaptic pruning rules observed during the brain's development: a connection is pruned if it is inactive or only weakly activated for multiple consecutive steps. Extensive experiments demonstrate the effectiveness of our method on classification tasks of different complexity with the MNIST, Fashion-MNIST, and CIFAR-10 datasets. The results show that even for a compact network, the proposed method can remove 59–90% of the connections, with relative improvements in learning speed and accuracy.
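To make the pruning rule concrete, the following is a minimal NumPy sketch of activity-based synaptic pruning in the spirit described above; it is not the paper's exact Algorithm 1, and the layer shape, activity measure, `threshold`, and `patience` window are all illustrative assumptions.

```python
import numpy as np

# Minimal sketch (illustrative, not the paper's exact Algorithm 1):
# a connection whose activity stays below `threshold` for `patience`
# consecutive steps is pruned, i.e. masked to zero.

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 8))      # weights of one fully connected layer
mask = np.ones_like(W)               # 1 = connection alive, 0 = pruned
low_count = np.zeros_like(W)         # consecutive low-activity counter
threshold, patience = 0.05, 3        # assumed hyperparameters

for step in range(10):
    x = rng.standard_normal(8)                   # stand-in for a training input
    activity = np.abs((W * mask) * x)            # per-connection contribution |w_ij * x_j|
    low = activity < threshold
    low_count = np.where(low, low_count + 1, 0)  # reset counter when a synapse is active
    mask[low_count >= patience] = 0.0            # prune persistently inactive synapses

print(f"pruned {int((mask == 0).sum())} of {mask.size} connections")
```

Because pruned connections contribute zero activity, their counters keep incrementing and they remain pruned, so the mask shrinks monotonically as training proceeds.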

Highlights

  • DNNs achieve state-of-the-art performance on many machine learning tasks, but their complex architectures incur heavy computation and parameter-storage costs that limit deployment on resource-constrained devices

  • We propose a brain-inspired synaptic pruning (BSP) algorithm modeled on the synaptic pruning mechanism of the developing human brain

  • We evaluate our method on classification tasks of different complexity, spanning multiple datasets and training sets of different sizes


Introduction

Deep neural networks (DNNs) have achieved state-of-the-art performance on a variety of machine learning tasks, including image classification (Krizhevsky et al., 2012; He et al., 2015; Simonyan and Zisserman, 2015), face recognition (Lawrence et al., 1997), video prediction (Deng et al., 2013), and speech recognition (Hinton et al., 2012; Abdel-Hamid et al., 2014). In spite of their superior performance, these complex network architectures lead to a significant increase in computation and parameter-storage costs, which limits their deployment on resource-constrained devices. Pruning can reduce these costs, but in most existing methods the pruning thresholds need to be carefully defined for different conditions.
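The sensitivity of fixed thresholds is easy to see with a small example: the same magnitude threshold prunes very different fractions of weights in layers with different scales, which is why such thresholds must be tuned per network and per task. The sketch below is purely illustrative; the scales and threshold are assumed values, not taken from the paper.

```python
import numpy as np

# Illustrative only: one global magnitude threshold prunes very different
# fractions of weights depending on each layer's weight scale.

rng = np.random.default_rng(1)
threshold = 0.1                                    # a single global threshold
for scale in (1.0, 0.1):                           # two layers with different weight scales
    w = scale * rng.standard_normal(10_000)
    frac = float(np.mean(np.abs(w) < threshold))   # fraction that would be pruned
    print(f"scale={scale}: would prune {frac:.0%} of weights")
```

Running this prunes only a few percent of the first layer but most of the second, even though both were initialized from the same distribution up to scale.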

