Abstract

Surrogate models used in evolutionary algorithms (EAs) aim to reduce the number of computationally expensive objective function evaluations. However, low-quality surrogates may mislead EAs and, as a result, surrogate-assisted EAs may fail to locate the global optimum. Among the various machine learning models used as surrogates, Gaussian Process (GP) models have been shown to be effective because they provide both a fitness estimate and a confidence level. One weakness of GP models is that the computational cost of training increases rapidly as the number of training samples grows. To reduce this training cost, we propose adopting an ensemble of local Gaussian Process models. Unlike independent local Gaussian Process models, the local models in the ensemble share the same model parameters. The performance of the covariance matrix adaptation evolution strategy (CMA-ES) assisted by such an ensemble is then compared under five different sampling strategies. Experiments on eight benchmark functions demonstrate that ensembles of local Gaussian Process models provide reliable fitness predictions and uncertainty estimates. Among the compared strategies, the clustering technique combined with the lower confidence bound sampling strategy exhibits the best global search performance.
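
The sketch below is not the authors' implementation; it only illustrates, under stated assumptions, the general idea of an ensemble of local GP models combined with a lower confidence bound (LCB) criterion: the training archive is partitioned by k-means, one GP is fitted per cluster with a shared fixed kernel standing in for the shared model parameters, and candidate solutions (e.g., CMA-ES offspring) are ranked by the LCB of the local model responsible for them. The scikit-learn classes, the RBF kernel, the cluster count, and the kappa weight are illustrative assumptions, not details taken from the paper.

    # Minimal sketch (not the paper's implementation): local GP ensemble + LCB.
    # Assumes scikit-learn; kernel, cluster count, and kappa are illustrative.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def fit_local_gps(X, y, n_clusters=4, length_scale=1.0):
        """Partition the archive with k-means and fit one GP per cluster.
        A shared fixed kernel (optimizer=None) stands in for the paper's
        shared model parameters."""
        km = KMeans(n_clusters=n_clusters, n_init=10).fit(X)
        kernel = RBF(length_scale=length_scale)
        gps = []
        for c in range(n_clusters):
            mask = km.labels_ == c
            gp = GaussianProcessRegressor(kernel=kernel, optimizer=None)
            gp.fit(X[mask], y[mask])
            gps.append(gp)
        return km, gps

    def lcb(km, gps, X_cand, kappa=2.0):
        """Lower confidence bound (mean - kappa * std) from the local GP
        whose cluster centre is nearest to each candidate (minimization)."""
        labels = km.predict(X_cand)
        scores = np.empty(len(X_cand))
        for i, (x, c) in enumerate(zip(X_cand, labels)):
            mu, std = gps[c].predict(x.reshape(1, -1), return_std=True)
            scores[i] = mu[0] - kappa * std[0]
        return scores

    # Usage: rank candidate offspring and pick the most promising one for
    # evaluation with the true (expensive) objective.
    rng = np.random.default_rng(0)
    X = rng.uniform(-5, 5, size=(200, 2))
    y = np.sum(X**2, axis=1)                  # sphere function as a stand-in
    km, gps = fit_local_gps(X, y)
    X_cand = rng.uniform(-5, 5, size=(20, 2)) # e.g., CMA-ES offspring
    best = X_cand[np.argmin(lcb(km, gps, X_cand))]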
