Abstract

Blackbox optimization tackles problems whose functions are expensive to evaluate and for which no analytical information is available. In this context, a tried-and-tested technique is to build surrogates of the objective and the constraints in order to conduct the optimization at a lower computational cost. This work introduces an extension to a specific type of surrogate, ensembles of surrogates, enabling them to quantify the uncertainty of the predictions they produce. The resulting extended ensembles of surrogates behave as stochastic models and allow the use of efficient Bayesian optimization tools. The method is incorporated into the search step of the mesh adaptive direct search (MADS) algorithm to improve the exploration of the search space. Computational experiments are conducted on seven analytical problems, two multi-disciplinary optimization problems, and two simulation problems. The results show that the proposed approach solves expensive simulation-based problems with greater precision and lower computational effort than stochastic models.
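The sketch below illustrates the general idea described in the abstract, not the authors' implementation: several cheap surrogates are fit to the same set of evaluated points, the spread of their predictions is used as an uncertainty estimate, and the resulting mean/spread pair drives an expected-improvement criterion that suggests candidate points, as a surrogate-based search step would. The blackbox function, the polynomial ensemble members, and the candidate grid are all illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Stand-in for an expensive blackbox objective (illustrative assumption).
def blackbox(x):
    return np.sin(3.0 * x) + 0.5 * x**2

# Points already evaluated by the blackbox.
X = rng.uniform(-2.0, 2.0, size=8)
y = blackbox(X)

# Ensemble of simple surrogates: polynomial fits of increasing degree.
degrees = [1, 2, 3, 4]
members = [np.polyfit(X, y, d) for d in degrees]

def ensemble_predict(x):
    """Mean and spread of the member predictions at points x."""
    preds = np.array([np.polyval(c, x) for c in members])
    mu = preds.mean(axis=0)
    # Disagreement between members serves as the uncertainty proxy.
    sigma = preds.std(axis=0) + 1e-12
    return mu, sigma

def expected_improvement(x, y_best):
    """Expected-improvement criterion treating the ensemble as a stochastic model."""
    mu, sigma = ensemble_predict(x)
    z = (y_best - mu) / sigma
    return sigma * (z * norm.cdf(z) + norm.pdf(z))

# Propose the next point by maximizing expected improvement on a candidate grid;
# in a MADS-type method such a point would be passed to the search step.
candidates = np.linspace(-2.0, 2.0, 401)
x_next = candidates[np.argmax(expected_improvement(candidates, y.min()))]
print(f"suggested search point: {x_next:.3f}")
```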
