Abstract

Surrogate-assisted evolutionary algorithms (SAEAs) have recently shown excellent performance on computationally expensive optimization problems (EOPs), but most are not designed for high-dimensional EOPs (HEOPs). This paper proposes and thoroughly investigates GIDPM, a new algorithm for HEOPs in which gradient information is incorporated into a dimension perturbation mutation. Specifically, for each solution, a local candidate offspring pool is obtained by combining gradient descent information with dimension perturbation mutation, and the DE/current-to-rand/2/bin operator is employed to generate a global candidate offspring pool. An offspring selection strategy then identifies the final offspring solutions by considering, in turn, the overall quality of the two pools, the approximate gradient descent directions of the candidate solutions, and their distributions and RBF predictions, so as to achieve appropriate local exploitation or global exploration. In addition, an autoencoder-embedded search strategy is applied every few iterations to quickly escape local optima caused by small ripples or oscillations in the landscape, enabling a leapfrog search. GIDPM thereby maintains a good balance between exploration and exploitation. Experimental studies on benchmark problems demonstrate that GIDPM achieves better convergence when the problem dimension is at least 30, and that its advantage grows as the dimension increases. GIDPM is also applied to the design of a sandwich panel with truss cores, where it obtains a design with better structural performance than a gradient-based algorithm.
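As a concrete illustration of the global candidate pool generation named above, the sketch below implements the standard DE/current-to-rand/2/bin operator in Python. The scale factors F and K, the crossover rate CR, and the population layout are illustrative assumptions; the abstract does not specify the paper's parameter settings, and the surrounding GIDPM selection logic is not reproduced here.

```python
import numpy as np

def de_current_to_rand_2_bin(pop, i, F=0.5, K=0.5, CR=0.9, rng=None):
    """Generate one trial vector with DE/current-to-rand/2/bin.

    pop : (NP, D) array holding the current population (NP >= 6)
    i   : index of the target (current) solution
    F, K, CR : scale factors and crossover rate (illustrative defaults)
    """
    rng = np.random.default_rng() if rng is None else rng
    NP, D = pop.shape
    # Pick five distinct random indices, all different from the target i.
    r1, r2, r3, r4, r5 = rng.choice(
        [j for j in range(NP) if j != i], size=5, replace=False)
    x = pop[i]
    # current-to-rand/2 mutation: move toward a random member, plus two
    # scaled difference vectors that supply global exploration pressure.
    v = (x + K * (pop[r1] - x)
           + F * (pop[r2] - pop[r3])
           + F * (pop[r4] - pop[r5]))
    # Binomial (bin) crossover: inherit each dimension from the mutant
    # with probability CR; one random dimension always comes from v.
    mask = rng.random(D) < CR
    mask[rng.integers(D)] = True
    return np.where(mask, v, x)
```

In GIDPM, trial vectors generated this way would additionally be screened (for example by the RBF surrogate) before any expensive true evaluation; that offspring selection strategy is specific to the paper and is only summarized in the abstract.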
