Abstract

Convolutional Neural Networks (CNNs) perform well compared to other deep learning models in image recognition, particularly on handwritten alphanumeric datasets. A challenging task in applying CNNs is finding an architecture with the right hyperparameters, which is usually done by trial and error. Genetic algorithms (GAs) have been widely used for automatic hyperparameter optimization. However, the original GA with a fixed chromosome length can yield suboptimal solutions, because the number of CNN hyperparameters varies with the depth of the model. Previous work proposed variable-length chromosomes to overcome this drawback of the native GA. This paper proposes a variable-length GA that adds global hyperparameters, namely the optimizer and the learning rate, to tune CNN hyperparameters systematically and automatically and thereby improve performance. We optimize seven hyperparameters: learning rate, optimizer, kernel size, number of filters, activation function, number of layers, and pooling. The experimental results show that a population size of 25 produces the best fitness value and average fitness. In addition, the comparison results show that the proposed model is superior to the baseline model in terms of accuracy, achieving approximately 99.18%.
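The abstract does not give implementation details, but the encoding it describes can be illustrated with a short sketch. The following Python snippet is a minimal, hypothetical illustration of a variable-length chromosome that carries two global genes (optimizer, learning rate) plus a variable number of per-layer genes (filters, kernel, activation, pooling). The search spaces, the GA operators, and the stub fitness function are assumptions for illustration only; they are not the paper's actual implementation, which would train the CNN and use validation accuracy as the fitness value.

```python
import random

# All search spaces below are illustrative assumptions; the abstract does
# not specify the paper's actual ranges.
OPTIMIZERS = ["sgd", "adam", "rmsprop"]
LEARNING_RATES = [1e-1, 1e-2, 1e-3, 1e-4]
FILTERS = [16, 32, 64, 128]
KERNELS = [3, 5, 7]
ACTIVATIONS = ["relu", "tanh", "elu"]
POOLINGS = ["max", "avg", "none"]
MAX_LAYERS = 6

def random_chromosome():
    """Two global genes (optimizer, learning rate) plus a variable-length
    list of per-layer genes, so chromosome length depends on model depth."""
    depth = random.randint(1, MAX_LAYERS)
    return {
        "optimizer": random.choice(OPTIMIZERS),
        "lr": random.choice(LEARNING_RATES),
        "layers": [
            {
                "filters": random.choice(FILTERS),
                "kernel": random.choice(KERNELS),
                "activation": random.choice(ACTIVATIONS),
                "pooling": random.choice(POOLINGS),
            }
            for _ in range(depth)
        ],
    }

def fitness(chrom):
    """Placeholder: in the paper this would build and train the CNN and
    return its validation accuracy. A random stub keeps the sketch runnable."""
    return random.random()

def crossover(a, b):
    """One-point crossover on the layer lists; because the cut points are
    chosen independently, the child's depth can differ from both parents'."""
    cut_a = random.randint(1, len(a["layers"]))
    cut_b = random.randint(0, len(b["layers"]) - 1)
    layers = (a["layers"][:cut_a] + b["layers"][cut_b:])[:MAX_LAYERS]
    return {
        "optimizer": random.choice([a["optimizer"], b["optimizer"]]),
        "lr": random.choice([a["lr"], b["lr"]]),
        "layers": layers,
    }

def mutate(chrom, rate=0.1):
    """Resample each gene (global and per-layer) with a small probability."""
    if random.random() < rate:
        chrom["optimizer"] = random.choice(OPTIMIZERS)
    if random.random() < rate:
        chrom["lr"] = random.choice(LEARNING_RATES)
    for layer in chrom["layers"]:
        if random.random() < rate:
            layer["filters"] = random.choice(FILTERS)
        if random.random() < rate:
            layer["kernel"] = random.choice(KERNELS)
        if random.random() < rate:
            layer["activation"] = random.choice(ACTIVATIONS)
        if random.random() < rate:
            layer["pooling"] = random.choice(POOLINGS)
    return chrom

def evolve(pop_size=25, generations=10):
    """Plain generational GA with tournament selection and elitism;
    pop_size=25 mirrors the population size the abstract reports as best."""
    pop = [random_chromosome() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        next_pop = scored[:2]  # elitism: carry the two best forward
        while len(next_pop) < pop_size:
            p1 = max(random.sample(pop, 3), key=fitness)
            p2 = max(random.sample(pop, 3), key=fitness)
            next_pop.append(mutate(crossover(p1, p2)))
        pop = next_pop
    return max(pop, key=fitness)

best = evolve()
print(best["optimizer"], best["lr"], len(best["layers"]), "layers")
```

Encoding the optimizer and learning rate as global genes keeps them shared across all layers, while the per-layer gene list is what gives the chromosome the variable length the abstract refers to.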
