Abstract

Algorithms for black-box optimization must account for numerous properties of the objective function in advance. The covariance matrix adaptation evolution strategy (CMA-ES) is one of the state-of-the-art algorithms for black-box optimization. Despite its success, the CMA-ES fails to minimize objective functions that are both high-dimensional and ill-conditioned, such as the 100,000-dimensional Ellipsoid function. This is a serious obstacle to applying the CMA-ES to recent high-dimensional machine learning models. We confirm that the single step-size shared by all coordinates is one of the hindrances to the adaptation of the variance-covariance matrix. To address this, we propose a CMA-ES with coordinate selection. Coordinate selection enables us to vectorize the step-size and adapt each component of the vector to the scale of the selected coordinates. Furthermore, selecting coordinates based on estimated curvature reduces the condition number encountered while updating variables in the selected coordinate space. Our method is simple enough to apply easily to most variants of the CMA-ES: one only needs to execute the conventional algorithm in the selected coordinate space. Experimental results show that our method, applied to the CMA-ES, the sep-CMA-ES, and the VD-CMA, outperforms the conventional CMA-ES variants in terms of the number of function evaluations and the objective value reached when optimizing high-dimensional, ill-conditioned functions.
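The core idea described above can be illustrated with a minimal sketch: estimate per-coordinate curvature, select the coordinates with the largest curvature, and run an evolution-strategy update only in that subspace with a per-coordinate step-size vector. This is not the paper's algorithm; the curvature estimator (central second differences), the simple (1, λ)-style selection, and the 1/5-success-style step-size rule below are all simplifying assumptions made for illustration.

```python
import numpy as np

def ellipsoid(x, cond=1e4):
    """Ill-conditioned Ellipsoid test function (axis weights span 1..cond)."""
    n = len(x)
    w = cond ** (np.arange(n) / max(n - 1, 1))
    return float(w @ (x * x))

def curvature_select(f, x, k, eps=1e-4):
    """Pick the k coordinates with the largest estimated curvature,
    using central second differences along each axis (an assumption;
    the paper's estimator may differ)."""
    fx = f(x)
    c = np.empty(len(x))
    for i in range(len(x)):
        e = np.zeros(len(x))
        e[i] = eps
        c[i] = (f(x + e) - 2 * fx + f(x - e)) / eps ** 2
    return np.argsort(-c)[:k]

def coordinate_selected_es(f, x0, iters=300, k=4, lam=8, seed=0):
    """Toy coordinate-selected (1, lam)-ES with a vectorized step-size:
    each iteration mutates only the selected coordinates and adapts
    only their step-size components."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    sigma = np.ones_like(x)          # per-coordinate step-size vector
    fx = f(x)
    for _ in range(iters):
        idx = curvature_select(f, x, k)
        best, fbest = x, fx
        for _ in range(lam):
            y = x.copy()
            y[idx] += sigma[idx] * rng.standard_normal(k)
            fy = f(y)
            if fy < fbest:
                best, fbest = y, fy
        # crude 1/5-success-style adaptation of the selected step-sizes
        sigma[idx] *= 1.2 if fbest < fx else 0.8
        x, fx = best, fbest
    return x, fx
```

On the Ellipsoid function, the high-curvature (heavily weighted) coordinates are selected and shrunk first, which is the intuition behind reducing the condition number seen by the update in the selected subspace.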
