Abstract

In recent years, convolutional neural networks (CNNs) have become increasingly important in machine learning, especially for computer vision. However, deep network models are difficult to deploy on hardware-constrained devices because of their huge number of parameters, storage requirements, and computational cost. This paper proposes a feature-map pruning method that removes redundant feature information from a deep network, simplifying the network structure while reducing computational complexity and speeding up inference. We first define a small chi-square supervised set and extract the feature maps of this set and of the training set. Two variance matrices are then constructed, and the differences between them are used to compute chi-square distances. Through repeated experiments, an optimal position threshold is set, and the feature maps corresponding to channels ranked below this threshold are pruned. Experimental results show that the method reduces network redundancy, storage space, and network complexity by up to 70% in total, without appreciably diminishing the network's accuracy. In the worst case, the difference in image-classification accuracy between the simplified network and the original network was less than 0.4%. The use of a small pre-trained network also speeds up training. Together, these improvements constitute an important step towards the effective deployment of CNNs on constrained devices.
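
The abstract does not spell out how the variance matrices are built, so the following is only a minimal NumPy sketch of one plausible reading: each channel is scored by a chi-square-style distance between its feature-map variance on the training set and on the small supervised set, and channels ranked below a position threshold are pruned. The function names, the (N, C, H, W) layout, and the keep_ratio value are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def channel_chi_square_scores(train_feats, supervised_feats):
    """Score each channel by a chi-square-style distance between the
    per-channel variances of the training set and the small supervised set.

    train_feats, supervised_feats: arrays of shape (N, C, H, W)
    (assumed layout; the paper's exact variance-matrix construction
    may differ).
    """
    # Per-channel variance over samples and spatial positions -> shape (C,).
    var_train = train_feats.var(axis=(0, 2, 3))
    var_sup = supervised_feats.var(axis=(0, 2, 3))

    # Chi-square-style distance between the two variance profiles,
    # computed channel by channel.
    eps = 1e-12
    return (var_train - var_sup) ** 2 / (var_train + var_sup + eps)

def channels_to_prune(scores, keep_ratio=0.3):
    """Return indices of channels ranked below the position threshold.

    keep_ratio=0.3 merely illustrates the ~70% reduction reported in the
    abstract; the actual threshold is chosen experimentally.
    """
    order = np.argsort(scores)                      # ascending: lowest scores first
    n_prune = int(len(scores) * (1.0 - keep_ratio))
    return order[:n_prune]                          # channels to remove
```

A typical use would be to collect feature maps for one layer, compute `scores = channel_chi_square_scores(train_feats, sup_feats)`, and then drop the channels returned by `channels_to_prune(scores)` before fine-tuning the slimmed network.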
