Abstract
GOMORS is a parallel, response-surface-assisted evolutionary algorithm for multi-objective optimization, designed to obtain good non-dominated solutions to black-box problems with relatively few objective function evaluations. GOMORS uses Radial Basis Functions (RBFs) to iteratively build surrogate response surfaces that approximate the computationally expensive objective functions. A multi-objective search combining evolution, local search, multi-method search and non-dominated sorting is performed on the surrogate RBF surface, which is inexpensive to compute. A balance between exploration, exploitation and diversification is obtained through a novel procedure that, within each algorithm iteration, simultaneously selects evaluation points according to several metrics, including approximate hypervolume improvement, maximizing the minimum decision-space distance, maximizing the minimum objective-space distance, and surrogate-assisted local search; these evaluations can be computed in parallel. The results are compared to ParEGO (a kriging-based surrogate method that solves many weighted single-objective optimizations) and the widely used NSGA-II. The results indicate that GOMORS outperforms both ParEGO and NSGA-II on the problems tested. For example, on a groundwater PDE problem, GOMORS outperforms ParEGO with 100, 200 and 400 evaluations on 6-, 12- and 24-dimensional versions of the problem. For a fixed number of evaluations, the performance gap between GOMORS and ParEGO widens as the number of dimensions increases; as the number of evaluations increases, the gap narrows. Both surrogate-based methods are much better than NSGA-II for all cases considered.
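One of the selection metrics named in the abstract, maximizing the minimum decision-space distance, picks the candidate point farthest from all previously evaluated points. A minimal illustrative sketch of that exploration criterion (not the authors' implementation; the function name and point representation are assumptions for illustration):

```python
import math

def maximin_distance_point(candidates, evaluated):
    """Return the candidate whose distance to its nearest already-evaluated
    point is largest, i.e. the max-min decision-space distance criterion."""
    def nearest_dist(c):
        # Distance from candidate c to the closest evaluated point.
        return min(math.dist(c, e) for e in evaluated)
    return max(candidates, key=nearest_dist)
```

For example, with evaluated points `(0, 0)` and `(1, 1)`, the candidate `(2, 2)` is chosen over candidates near the existing points, since its nearest-neighbor distance is largest.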
Highlights
Multi-objective optimization (MO) approaches involve a large number of function evaluations, which makes it difficult to use MO in simulation–optimization problems where the optimization is multi-objective and the nonlinear simulation is computationally expensive and has multiple local minima
This paper focuses on the use of radial basis functions (RBFs) and evolutionary algorithms for multi-objective optimization of computationally expensive problems, where the number of function evaluations is limited relative to the problem dimension
ParEGO is not designed for high dimensional problems
Summary
Multi-objective optimization (MO) approaches involve a large number of function evaluations, which makes it difficult to use MO in simulation–optimization problems where the optimization is multi-objective and the nonlinear simulation is computationally expensive and has multiple local minima (multi-modal). Regis and Shoemaker [36] were the first to use Radial Basis Functions (not a neural net) to improve the efficiency of an evolutionary algorithm under a limited number of evaluations. Later they introduced a non-evolutionary algorithm, Stochastic RBF [37], a very effective radial basis function-based method for single-objective optimization of expensive global optimization problems. An added purpose of the investigation is to be able to solve MO problems where the number of decision variables varies between 15 and 25. To this end we propose a new algorithm, GOMORS, that combines radial basis function approximation with multi-objective evolutionary optimization within the general iterative framework of surrogate-assisted heuristic search algorithms. The selection strategy incorporates the power of local search and ensures that the algorithm can be used in a parallel setting to further improve its efficiency.
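The multi-objective evolutionary framework described above relies on non-dominated sorting to identify the current Pareto-optimal set. A minimal sketch of the first step of that idea, extracting the non-dominated front for minimization (an illustrative helper, not the GOMORS code; function name and tuple representation are assumptions):

```python
def nondominated(points):
    """Keep only the objective vectors not dominated by any other point
    (minimization): q dominates p if q <= p componentwise and q != p."""
    def is_dominated(p):
        return any(q != p and all(qi <= pi for qi, pi in zip(q, p))
                   for q in points)
    return [p for p in points if not is_dominated(p)]
```

For instance, among the bi-objective points `(1, 2)`, `(2, 1)`, `(2, 2)` and `(3, 3)`, only the first two survive: `(2, 2)` is dominated by `(1, 2)`, and `(3, 3)` by both.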