Abstract

Convolutional neural networks (CNNs) use various optimization methods and network architectures, and each optimization method and architecture style has its own advantages and representation abilities. To make the most of these advantages, evolutionary-fuzzy-integral-based convolutional neural networks (EFI-CNNs) are proposed in this paper. The proposed EFI-CNNs were verified by face classification of age and gender. The trained CNNs' outputs were used as the inputs of a fuzzy integral, and the classification results were combined using either the Sugeno or the Choquet fuzzy integral rule. Conventionally, the fuzzy density values of the fuzzy integral are determined by heuristic experiments; in this paper, particle swarm optimization (PSO) was used to adaptively find optimal fuzzy density values. To combine the advantages of each CNN type, each CNN in the EFI-CNNs must be evaluated. Three CNN structures, AlexNet, the very deep convolutional neural network (VGG16), and GoogLeNet, and three databases, the computational intelligence application laboratory (CIA) dataset, Morph, and the cross-age celebrity dataset (CACD2000), were used in experiments to classify age and gender. The experimental results show that the proposed method achieved 5.95% and 3.1% higher accuracy in classifying age and gender, respectively.
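Below is a minimal sketch, not the authors' released code, of the fusion step described in the abstract: per-class confidences from several trained CNNs are combined through a Sugeno λ-measure built from fuzzy density values, using either the Sugeno or the Choquet integral rule. The function names, example scores, and density values are illustrative assumptions; in the paper the density values are found by PSO rather than set by hand.

import numpy as np
from scipy.optimize import brentq

def lambda_measure(densities):
    # Solve 1 + lam = prod(1 + lam * g_i) for the unique root lam > -1, lam != 0.
    g = np.asarray(densities, dtype=float)
    if np.isclose(g.sum(), 1.0):          # additive case: no interaction term needed
        return 0.0
    f = lambda lam: np.prod(1.0 + lam * g) - (1.0 + lam)
    # the non-trivial root lies in (-1, 0) when sum(g) > 1 and in (0, inf) otherwise
    if g.sum() > 1.0:
        return brentq(f, -1.0 + 1e-9, -1e-9)
    return brentq(f, 1e-9, 1e9)

def fuzzy_integral(scores, densities, rule="choquet"):
    # Fuse one class's confidence scores (one value per CNN) into a single value.
    h = np.asarray(scores, dtype=float)
    g = np.asarray(densities, dtype=float)
    lam = lambda_measure(g)
    order = np.argsort(h)[::-1]           # sort the confidences in descending order
    h_sorted, g_sorted = h[order], g[order]
    G = np.empty_like(g_sorted)           # cumulative measure g(A_i) of the top-i sources
    G[0] = g_sorted[0]
    for i in range(1, len(g_sorted)):
        G[i] = g_sorted[i] + G[i - 1] + lam * g_sorted[i] * G[i - 1]
    if rule == "sugeno":
        # Sugeno rule: max_i min(h_(i), g(A_i))
        return float(np.max(np.minimum(h_sorted, G)))
    # Choquet rule: sum_i h_(i) * (g(A_i) - g(A_{i-1}))
    return float(np.sum(h_sorted * np.diff(np.concatenate(([0.0], G)))))

# Example: three CNNs (e.g., AlexNet, VGG16, GoogLeNet) scoring one class
scores = [0.80, 0.65, 0.90]               # hypothetical per-network confidences
densities = [0.35, 0.25, 0.40]            # hypothetical fuzzy densities (found by PSO in the paper)
print(fuzzy_integral(scores, densities, rule="choquet"))

At test time this fusion is applied to every class, and the class with the largest fused value is taken as the final prediction.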

Highlights

  • Image recognition technology has continued to develop due to deep learning; the origins of convolutional neural networks (CNNs) can be traced back to 1998

  • We utilized EFI-CNNs, based on fuzzy integral theory, for age and gender classification of faces from the computational intelligence application laboratory (CIA), Morph, and CACD2000 databases (a sketch of the PSO-based density search follows this list)

  • The trained CNNs' outputs were used as inputs to the fuzzy integral, and the final fuzzy integral rule was chosen from either Sugeno or Choquet during testing
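As a companion to the fusion sketch above, the following is a minimal illustration of the evolutionary part of the EFI-CNNs: a plain particle swarm search over the fuzzy density values that maximizes the validation accuracy of the fused classifier. It assumes the fuzzy_integral() helper from the previous sketch, a list probs of per-network class-probability arrays of shape (num_samples, num_classes), and integer labels y; the swarm size, inertia weight, and acceleration constants are illustrative and not the paper's settings.

import numpy as np

def fused_accuracy(densities, probs, y, rule="choquet"):
    # Validation accuracy when the networks are fused with the given density values.
    n_samples, n_classes = probs[0].shape
    preds = np.empty(n_samples, dtype=int)
    for s in range(n_samples):
        fused = [fuzzy_integral([p[s, c] for p in probs], densities, rule)
                 for c in range(n_classes)]
        preds[s] = int(np.argmax(fused))
    return float(np.mean(preds == y))

def pso_densities(probs, y, rule="choquet", swarm=20, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    dim = len(probs)                                  # one fuzzy density per CNN
    x = rng.uniform(0.05, 0.95, size=(swarm, dim))    # particle positions (density vectors)
    v = np.zeros_like(x)                              # particle velocities
    pbest = x.copy()                                  # personal best positions
    pbest_fit = np.array([fused_accuracy(p, probs, y, rule) for p in x])
    gbest = pbest[np.argmax(pbest_fit)].copy()        # global best position
    w, c1, c2 = 0.7, 1.5, 1.5                         # inertia and acceleration constants
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, 0.01, 0.99)                # keep density values inside (0, 1)
        fit = np.array([fused_accuracy(p, probs, y, rule) for p in x])
        improved = fit > pbest_fit
        pbest[improved], pbest_fit[improved] = x[improved], fit[improved]
        gbest = pbest[np.argmax(pbest_fit)].copy()
    return gbest, float(pbest_fit.max())

The returned gbest vector is the set of fuzzy density values that gave the best fused validation accuracy; it would then be fixed and reused when classifying the test set.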

Introduction

The origins of convolutional neural networks (CNNs) can be traced back to 1998, and image recognition technology has continued to develop due to deep learning. LeCun et al. [1] proposed the LeNet-5 model and used the back-propagation (BP) algorithm to adjust the parameters of the neural network; this model remains a successful convolutional neural network even today. A deeper network architecture, AlexNet [2], was proposed by Alex Krizhevsky and opened up the development of deep learning. AlexNet contains 60 million parameters, uses the rectified linear unit (ReLU) as the activation function, which differs from LeNet, and adds dropout to avoid model overfitting. GoogLeNet [4] improved the MLP convolutional layer by using a 1 × 1 convolutional kernel, which achieves cross-channel message exchange and reduces dimensions. Due to the degradation problem, these networks cannot learn features in deeper layers; to solve this problem, ResNet [5] was proposed.
