Abstract

The use of convolutional neural networks involves hyperparameter optimization. Gaussian process based Bayesian optimization (GPEI) has proven to be an effective algorithm for optimizing several hyperparameters. The deep networks for global optimization algorithm (DNGO), which uses a neural network as an alternative to a Gaussian process, was later proposed to optimize more hyperparameters. This paper presents a new algorithm that combines multiscale and multilevel evolutionary optimization (MSMLEO) with GPEI to optimize dozens of hyperparameters. These hyperparameters are divided into two groups. The first group, related to the sizes of layers and kernels, consists of discrete integers. The second group, related to learning rates and similar settings, consists of continuous floating-point numbers. Every combination of the first group corresponds to a combination of grid points on multiscale grids, and MSMLEO launches GPEI to optimize the second group of hyperparameters while the first group remains fixed. The output of a convolutional network configured with the two groups of optimized hyperparameters serves as the fitness value for MSMLEO. MSMLEO alternates with GPEI to search for the optimal hyperparameters from the coarsest scale to the finest. Experimental results show that our algorithm achieves better performance and adaptability when optimizing dozens of neural-network hyperparameters of varied numerical types.
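The alternating two-level search described above can be summarized in a minimal sketch. This is illustrative only, not the paper's implementation: `evaluate_network` is a hypothetical stand-in for training a CNN and returning its validation error, scikit-optimize's `gp_minimize` with `acq_func="EI"` stands in for the GPEI step, and the evolutionary step is reduced to elitist mutation over grid values for brevity.

```python
# A minimal sketch of the alternating MSMLEO/GPEI loop, assuming
# scikit-optimize (skopt) as a stand-in for the GPEI step.
import random

from skopt import gp_minimize
from skopt.space import Real


def evaluate_network(discrete_hps, continuous_hps):
    """Hypothetical stand-in: in the paper this trains a CNN configured
    with both hyperparameter groups and returns its validation error.
    Here a toy surrogate keeps the sketch runnable."""
    lr, momentum = continuous_hps
    return (sum(discrete_hps) % 7) + (lr - 0.01) ** 2 + (momentum - 0.9) ** 2


def gpei_inner(discrete_hps, n_calls=15):
    """GPEI step: Bayesian optimization of the continuous group
    (learning rate, momentum, ...) with the discrete group held fixed."""
    space = [
        Real(1e-5, 1e-1, prior="log-uniform", name="lr"),
        Real(0.0, 0.99, name="momentum"),
    ]
    res = gp_minimize(
        lambda x: evaluate_network(discrete_hps, x),
        space,
        acq_func="EI",       # expected improvement acquisition
        n_calls=n_calls,
        random_state=0,
    )
    return res.fun  # best validation error = fitness for MSMLEO


def mutate(individual, grid):
    """Replace one coordinate of an individual with another grid value."""
    child = list(individual)
    i = random.randrange(len(child))
    child[i] = random.choice(grid[i])
    return tuple(child)


def msmleo(scales, pop_size=8, generations=4):
    """Evolutionary search over the discrete group, coarse to fine.
    Each individual is a tuple of grid values (layer sizes, kernel sizes);
    each finer grid is seeded with the best individual found so far."""
    best = None
    for grid in scales:
        pop = [tuple(random.choice(axis) for axis in grid)
               for _ in range(pop_size)]
        if best is not None:
            pop[0] = best  # carry the coarse optimum to the finer grid
        for _ in range(generations):
            scored = sorted(pop, key=gpei_inner)   # fitness via inner GPEI
            elite = scored[: pop_size // 2]
            pop = elite + [mutate(e, grid) for e in elite]
        best = min(pop, key=gpei_inner)
    return best


# Example (assumed values): layer widths and kernel sizes on a coarse
# grid, then a finer grid, mirroring the coarse-to-fine schedule.
coarse = [(8, 32, 128), (3, 7, 11)]
fine = [(8, 16, 32, 64, 128), (3, 5, 7, 9, 11)]
print(msmleo([coarse, fine]))
```

The point of the structure is the division of labor: the outer evolutionary loop only ever sees discrete grid points, while every fitness evaluation internally runs a full continuous Bayesian optimization, so each group is handled by the search method suited to its numerical type.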
