Abstract

When optimizing a large-scale problem, an evolutionary algorithm typically requires a substantial number of fitness evaluations to find a good approximation to the global optimum, which becomes problematic when the problem is also computationally expensive. Surrogate-assisted evolutionary algorithms have shown good performance on high-dimensional problems, but mostly on those with no more than 200 dimensions, and it is difficult to train sufficiently accurate surrogate models for a large-scale optimization problem because of the scarcity of training data. In this paper, a random feature selection technique is used at each generation to select decision variables from the original large-scale problem and form a number of sub-problems, whose dimensions may differ from one another. The population used to optimize the original problem is updated by sequentially optimizing each sub-problem, assisted by a surrogate constructed for that sub-problem. A new candidate solution of the original problem is then generated by replacing the corresponding decision variables of the best solution found so far with those of the sub-problem that achieved the best approximated fitness among all sub-problems; this candidate is evaluated on the original expensive problem and used to update the best solution. To evaluate the performance of the proposed method, we conduct experiments on 15 CEC'2013 benchmark problems and compare it with several state-of-the-art algorithms. The experimental results show that the proposed method is more effective than these algorithms, especially on problems that are partially separable or non-separable.
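To make the per-generation loop concrete, the following is a minimal Python sketch of the idea described above, not the paper's actual implementation: it assumes an RBF surrogate, a crude Gaussian-mutation search on each sub-problem, and arbitrary parameter choices (archive size, number and size of sub-problems). All names (expensive_f, rbf_fit, optimize_subproblem) are illustrative assumptions.

```python
# Hedged sketch of surrogate-assisted optimization with random feature
# selection. The paper's operators, surrogate model, and settings may differ.
import numpy as np

rng = np.random.default_rng(0)

def expensive_f(x):
    # Placeholder for the real expensive objective (here: an ellipsoid).
    return np.sum(np.arange(1, x.size + 1) * x**2)

def rbf_fit(X, y, eps=1.0):
    # Fit a Gaussian-kernel RBF interpolant; return a callable predictor.
    K = np.exp(-eps * np.sum((X[:, None, :] - X[None, :, :])**2, axis=2))
    w = np.linalg.solve(K + 1e-8 * np.eye(len(X)), y)
    def predict(Z):
        Kz = np.exp(-eps * np.sum((Z[:, None, :] - X[None, :, :])**2, axis=2))
        return Kz @ w
    return predict

def optimize_subproblem(predict, lb, ub, d, iters=30, pop=20):
    # Simple evolutionary search (Gaussian mutation + greedy selection)
    # on the surrogate; returns the best sub-vector and its predicted fitness.
    P = rng.uniform(lb, ub, size=(pop, d))
    for _ in range(iters):
        C = np.clip(P + rng.normal(0, 0.1 * (ub - lb), P.shape), lb, ub)
        better = predict(C) < predict(P)
        P[better] = C[better]
    fit = predict(P)
    return P[np.argmin(fit)], fit.min()

D, lb, ub = 100, -5.0, 5.0
# Initial archive evaluated with the true function (size is an arbitrary
# choice for this sketch).
X = rng.uniform(lb, ub, size=(55, D))
y = np.array([expensive_f(x) for x in X])
best = X[np.argmin(y)].copy()

for gen in range(50):
    # Random feature selection: sub-problems of differing dimensions.
    subproblems = [rng.choice(D, size=rng.integers(5, 15), replace=False)
                   for _ in range(4)]
    best_sub, best_pred = None, np.inf
    for idx in subproblems:
        predict = rbf_fit(X[:, idx], y)            # surrogate on the sub-space
        xs, fs = optimize_subproblem(predict, lb, ub, idx.size)
        if fs < best_pred:
            best_sub, best_pred = (idx, xs), fs
    # Candidate: incumbent best with the winning sub-vector spliced in.
    cand = best.copy()
    cand[best_sub[0]] = best_sub[1]
    fc = expensive_f(cand)                         # one true evaluation
    X, y = np.vstack([X, cand]), np.append(y, fc)
    if fc < y[:-1].min():
        best = cand

print("best true fitness:", y.min())
```

The key design point the sketch illustrates is that each surrogate is trained only on the randomly selected decision variables, so the training data remain adequate even when the full problem has far too many dimensions for a single global surrogate, and only one true evaluation is spent per generation.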
