Abstract

Large-scale evolutionary optimization typically requires a great many fitness evaluations to locate the optimal solution, which hinders the application of evolutionary algorithms to computationally expensive problems. Surrogate-assisted evolutionary algorithms (SAEAs) have shown good performance under a limited computational budget. However, few SAEAs have been proposed for large-scale expensive problems, mainly because a reliable surrogate model is difficult to train owing to the curse of dimensionality. In this paper, we propose to employ the random grouping technique to divide a large-scale optimization problem into several low-dimensional sub-problems, and then train a surrogate ensemble for each sub-problem to assist its optimization. The next parent population for the large-scale problem is generated by the horizontal composition of the sub-problem populations. Furthermore, the best solution found so far for the sub-problem whose population has the best mean fitness value replaces, on its corresponding dimensions, the best solution found so far for the large-scale problem, and the resulting solution is evaluated with the expensive objective function. Experimental results on the CEC'2013 benchmark problems show that the proposed method is both effective and efficient for solving large-scale expensive optimization problems.
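To make the decomposition and recomposition steps mentioned above more concrete, the following is a minimal Python/NumPy sketch of random grouping and horizontal composition. The function names, group sizes, and population initialization are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def random_grouping(dim, num_groups, rng):
    """Randomly permute the decision variables and split the indices into
    (roughly) equal groups, one group per low-dimensional sub-problem."""
    perm = rng.permutation(dim)
    return np.array_split(perm, num_groups)

def horizontal_composition(sub_populations, groups, dim):
    """Recombine the sub-problem populations into full-dimensional parent
    solutions by writing each sub-population back onto its own group's dimensions."""
    pop_size = sub_populations[0].shape[0]
    parents = np.empty((pop_size, dim))
    for sub_pop, idx in zip(sub_populations, groups):
        parents[:, idx] = sub_pop
    return parents

# Usage sketch: a 1000-D problem split into 20 sub-problems of 50 variables each.
rng = np.random.default_rng(0)
dim, num_groups, pop_size = 1000, 20, 30
groups = random_grouping(dim, num_groups, rng)
sub_populations = [rng.uniform(-5.0, 5.0, size=(pop_size, len(idx))) for idx in groups]
parents = horizontal_composition(sub_populations, groups, dim)  # shape (30, 1000)
```

In the actual method, each sub-population would be evolved with the help of its surrogate ensemble before being recomposed into the next parent population for the large-scale problem.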

