Abstract

Deep-learning convolutional neural networks (CNNs) have proven successful in various cognitive applications thanks to their multilayer structure. However, their high computational energy and time requirements hinder practical application; hence, the realization of a highly energy-efficient and fast-learning neural network has aroused interest. In this work, we address the problem of saving computing resources by developing a deep model, termed the Gabor convolutional neural network (Gabor CNN), which incorporates highly expression-efficient Gabor kernels into CNNs. To effectively imitate the structural characteristics of traditional weight kernels, we improve upon traditional Gabor filters, giving them stronger frequency and orientation representations. In addition, we propose a procedure to train Gabor CNNs, termed the fast training method (FTM). In FTM, we design a new training scheme based on the multipopulation genetic algorithm (MPGA) and an evaluation structure to optimize the improved Gabor kernels, while the remaining Gabor CNN parameters are trained with back-propagation. Training the improved Gabor kernels with MPGA is much more energy-efficient, requiring fewer samples and iterations. Simple tasks, such as character recognition on the Mixed National Institute of Standards and Technology database (MNIST), traffic sign recognition on the German Traffic Sign Recognition Benchmark (GTSRB), and face detection on the Olivetti Research Laboratory database (ORL), are implemented using the LeNet architecture. The experimental results for the Gabor CNN and the MPGA training method show a 17–19% reduction in computational energy and time and an 18–21% reduction in storage requirements, with a less than 1% decrease in accuracy. By incorporating highly expression-efficient Gabor kernels into CNNs, we eliminate a significant fraction of the computation-hungry components of the training process.
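The abstract's notion of an "expression-efficient" kernel can be made concrete with a small sketch: a Gabor kernel is fully described by a handful of parameters rather than one free weight per pixel. The following NumPy example uses the classical Gabor formulation (the function name and parameterization are ours for illustration, not the paper's improved variant):

```python
import numpy as np

def gabor_kernel(size, sigma, theta, lam, gamma=0.5, psi=0.0):
    """Real-valued Gabor kernel of shape (size, size).

    sigma: width of the Gaussian envelope; theta: orientation (radians);
    lam: wavelength of the sinusoidal carrier; gamma: spatial aspect ratio;
    psi: phase offset. This is the classical formulation, not the paper's
    improved one.
    """
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate the coordinate frame by theta to orient the filter.
    x_t = x * np.cos(theta) + y * np.sin(theta)
    y_t = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(x_t**2 + gamma**2 * y_t**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * x_t / lam + psi)
    return envelope * carrier

# A small bank of 4 orientations, usable as fixed 5x5 convolutional kernels;
# each kernel is generated from (sigma, theta, lam) instead of 25 free weights.
bank = np.stack([gabor_kernel(5, 2.0, t, 4.0)
                 for t in np.linspace(0, np.pi, 4, endpoint=False)])
print(bank.shape)  # (4, 5, 5)
```

Because each kernel collapses to a few scalar parameters, searching over those parameters (as the MPGA does) is far cheaper than back-propagating through every weight of a conventional kernel.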

Highlights

  • Deep learning [1,2] has been used in a variety of detection [3,4,5], classification [6], and inference tasks [7,8]

  • The experimental result of the Gabor convolutional neural networks (CNNs) and multipopulation genetic algorithm (MPGA) training method shows a 17–19% reduction in computational energy and time and an 18–21% reduction in storage requirements with a less than 1% accuracy decrease

  • The large-scale structure and training complexity of convolutional neural networks (CNNs) make them among the most computationally intensive workloads across all modern computing platforms [10], so the implementation of energy-efficient kernels in neural networks is of interest


Summary

Introduction

Deep learning [1,2] has been used in a variety of detection [3,4,5], classification [6], and inference tasks [7,8]. The huge amounts of computational energy and time required to learn regular trainable weight kernels hinder their extensive practical application. The large-scale structure and training complexity of convolutional neural networks (CNNs) make them among the most computationally intensive workloads across all modern computing platforms [10], so the implementation of energy-efficient kernels in neural networks is of interest. Another line of work focuses on reducing the training complexity of a CNN [15,16]. The latter is an important challenge for CNNs, as high computational energy and time are needed. We visualize the convolutional kernels of a pretrained CNN model, as shown in Figure 1.
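To reduce training complexity, the paper evolves the Gabor kernel parameters with a multipopulation genetic algorithm instead of back-propagation. The toy sketch below is our own construction, with a simple negative-L2 fitness standing in for the paper's evaluation structure: several subpopulations of (sigma, theta, lam) candidates evolve independently and periodically exchange their best individuals.

```python
import numpy as np

rng = np.random.default_rng(0)

def gabor_kernel(size, sigma, theta, lam):
    # Classical Gabor kernel (the paper uses an improved variant).
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    x_t = x * np.cos(theta) + y * np.sin(theta)
    y_t = -x * np.sin(theta) + y * np.cos(theta)
    return (np.exp(-(x_t**2 + 0.25 * y_t**2) / (2 * sigma**2))
            * np.cos(2 * np.pi * x_t / lam))

def fitness(params, target):
    # Negative L2 distance to a target kernel -- a stand-in for the
    # paper's evaluation structure, which we do not reproduce here.
    return -np.sum((gabor_kernel(5, *params) - target) ** 2)

def evolve(pop, target, n_gen=30, mut=0.1):
    lo, hi = np.array([0.5, 0.0, 1.0]), np.array([4.0, np.pi, 8.0])
    for _ in range(n_gen):
        scores = np.array([fitness(p, target) for p in pop])
        elite = pop[np.argsort(scores)[-(len(pop) // 2):]]  # keep better half
        # Refill with mutated copies, clipped to a valid parameter range.
        children = np.clip(elite + rng.normal(0, mut, elite.shape), lo, hi)
        pop = np.vstack([elite, children])
    return pop

# Two subpopulations of (sigma, theta, lam) candidates with migration,
# mimicking the multipopulation structure at a very small scale.
target = gabor_kernel(5, 2.0, 0.7, 4.0)
pops = [rng.uniform([1, 0, 2], [3, np.pi, 6], (8, 3)) for _ in range(2)]
for _ in range(3):
    pops = [evolve(p, target) for p in pops]
    best = [p[np.argmax([fitness(q, target) for q in p])].copy() for p in pops]
    pops[0][0], pops[1][0] = best[1], best[0]  # migrate best individuals
```

Each candidate here is only three scalars, so the search space is tiny compared with a full weight kernel; this is the intuition behind the paper's report of fewer samples and iterations than conventional back-propagation of weight kernels.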

The visualization
Gabor Filters
Convolutional Neural Network
Combination of Gabor Filters and CNNs
Overview of Our Method
Improved Gabor Filters
MPGA Optimization for Gabor Convolutional Kernels
Training Method for Gabor CNNs
Implementation and Experiment
Energy Efficiency and Performance
Accuracy
Training
Storage Requirement Comparison
Effects of Iterations and Sampling
Findings
Conclusions

