Abstract

This paper presents HSOSR, a surrogate-based global optimization algorithm for expensive black-box optimization problems with box constraints. To reduce the difficulty of searching large-scale multimodal problems, a space reduction method based on hybrid surrogates is proposed. Kriging and radial basis function (RBF) models are employed to approximate the true expensive objective. A large number of candidate samples are generated by Latin hypercube sampling and evaluated on both surrogates, and the candidates are ranked by their predicted values from kriging and from RBF separately. Based on these ranks, two potentially promising regions are identified and two corresponding subspaces are constructed. Since kriging and RBF models typically exhibit multiple predicted optima, a multi-start optimization algorithm is proposed to select supplementary samples in the two subspaces alternately. In addition, newly added samples must satisfy a defined distance criterion to preserve sampling diversity. Once the algorithm becomes trapped in a local valley, the estimated mean squared error of the kriging model is maximized by the multi-start optimization strategy to explore sparsely sampled regions. Finally, HSOSR is tested on 10 low-dimensional and 5 high-dimensional benchmark cases and compared against 5 other surrogate-based global optimization algorithms. Compared with the well-known efficient global optimization (EGO) method, HSOSR improves computational efficiency by more than 50%. Overall, HSOSR demonstrates high efficiency and strong robustness in dealing with multimodal expensive black-box optimization problems.

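The abstract's hybrid-surrogate space-reduction step can be illustrated with a minimal sketch. This is not the authors' HSOSR implementation: kriging is stood in for by scikit-learn's GaussianProcessRegressor, the RBF surrogate by SciPy's RBFInterpolator, and the candidate count, top fraction, and bounding-box construction of the subspaces are assumptions chosen for illustration.

```python
# Sketch of the hybrid-surrogate space-reduction idea: fit kriging and RBF
# surrogates, rank a large Latin hypercube candidate set by each surrogate's
# predictions, and return two promising box subspaces (one per surrogate).
import numpy as np
from scipy.stats import qmc
from scipy.interpolate import RBFInterpolator
from sklearn.gaussian_process import GaussianProcessRegressor

def reduced_subspaces(X, y, lb, ub, n_candidates=5000, top_frac=0.1):
    """X, y: evaluated samples; lb, ub: box constraints (arrays of length dim)."""
    dim = X.shape[1]

    # Fit the two surrogates on the expensive samples evaluated so far.
    kriging = GaussianProcessRegressor(normalize_y=True).fit(X, y)
    rbf = RBFInterpolator(X, y)

    # Generate a large Latin hypercube candidate set inside the box constraints.
    cand = qmc.scale(qmc.LatinHypercube(d=dim).random(n_candidates), lb, ub)

    # Rank candidates by each surrogate's prediction (smaller = better for
    # minimization) and keep the top fraction for each surrogate.
    n_keep = max(1, int(top_frac * n_candidates))
    subspaces = []
    for predict in (kriging.predict, rbf):
        scores = np.asarray(predict(cand)).ravel()
        best = cand[np.argsort(scores)[:n_keep]]
        # Here the subspace is taken as the bounding box of the top-ranked
        # candidates (an illustrative choice, not necessarily the paper's rule).
        subspaces.append((best.min(axis=0), best.max(axis=0)))
    return subspaces
```

In the paper's framework, supplementary samples would then be sought inside these two subspaces by a multi-start optimizer, subject to a minimum-distance criterion on new points; that loop is omitted here.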