Abstract

Covariance matrix adaptation evolution strategy (CMA-ES) is a successful gradient-free optimization algorithm. However, it scales poorly to high-dimensional problems. In this paper, we propose a fast variant of CMA-ES (Fast CMA-ES) for large-scale black-box optimization problems. We approximate the covariance matrix by a low-rank matrix built from a few vectors and use two of them to generate each new solution. The algorithm's internal complexity is linear in the dimension of the search space. We show that the covariance matrix of the underlying distribution can be viewed as an ensemble of simple models, each constructed from two vectors. We experimentally investigate the algorithm's behavior and performance. It is more efficient than CMA-ES in terms of running time, and it outperforms or performs comparably to the limited-memory CMA-ES variant on large-scale problems. Finally, we evaluate the algorithm with a restart strategy on the CEC'2010 large-scale global optimization benchmarks, where it shows remarkable performance and outperforms the large-scale variants of CMA-ES.
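To illustrate the kind of sampling the abstract describes, below is a minimal sketch of drawing one candidate solution from a low-rank covariance model in O(n) time, where the covariance is represented implicitly by an isotropic term plus two stored direction vectors. The function name `sample_low_rank`, the mixing weights `w1`/`w2`, and the specific decomposition are illustrative assumptions, not the paper's exact update rules.

```python
import numpy as np

def sample_low_rank(mean, sigma, v1, v2, w1=0.2, w2=0.2, rng=None):
    """Draw one candidate from N(mean, sigma^2 * C), where the covariance
    is modeled implicitly as
        C = (1 - w1 - w2) * I + w1 * v1 v1^T + w2 * v2 v2^T.
    The full n-by-n matrix is never formed, so the cost per sample is O(n)."""
    rng = np.random.default_rng() if rng is None else rng
    n = mean.shape[0]
    z = rng.standard_normal(n)        # isotropic Gaussian component
    r1, r2 = rng.standard_normal(2)   # scalar coefficients along the two vectors
    y = (np.sqrt(1.0 - w1 - w2) * z
         + np.sqrt(w1) * r1 * v1
         + np.sqrt(w2) * r2 * v2)
    return mean + sigma * y

# Example usage with hypothetical unit-norm direction vectors v1, v2:
# n = 1000
# v1 = np.ones(n) / np.sqrt(n)
# v2 = np.eye(n)[0]
# x = sample_low_rank(np.zeros(n), 0.5, v1, v2)
```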
