Abstract

Nature-inspired stochastic search techniques such as Evolutionary Algorithms (EAs) are known for their ability to solve complex optimization problems. However, they typically require numerous function evaluations during their search to find global optima. This is a drawback when EAs are applied to optimization problems with computationally expensive objective functions. Surrogate models have been used as inexpensive approximations to replace such functions. In this study, we propose a dynamically retrained multilayer perceptron-based surrogate model coupled with a genetic algorithm (GA) to reduce the number of function evaluations in the optimization process. The proposed method has been successfully applied to several test functions and real-world aerodynamic shape optimization problems. It is also shown to converge more quickly towards the Pareto-optimal front, with fewer function evaluations than a stand-alone GA, in all of the optimization problems considered.
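The sketch below is only a rough illustration of the general idea described in the abstract, not the authors' implementation: a GA whose fitness calls go mostly through an MLP surrogate, with the surrogate periodically retrained on newly evaluated individuals. For brevity it assumes a single-objective sphere function standing in for the expensive aerodynamic objective (the paper targets multi-objective problems with a Pareto front), and all names and hyperparameters (population size, retraining interval, network size) are illustrative choices.

```python
# Minimal sketch of a surrogate-assisted GA with dynamic retraining.
# Assumptions: single-objective sphere function as a stand-in for an expensive
# objective; all hyperparameters are illustrative, not from the paper.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def expensive_objective(x):
    """Stand-in for a costly simulation (e.g., a CFD run)."""
    return float(np.sum(x ** 2))

DIM, POP, GENS, RETRAIN_EVERY = 5, 30, 40, 5

# Initial population, evaluated exactly to build the first training archive.
pop = rng.uniform(-5.0, 5.0, size=(POP, DIM))
X_train = pop.copy()
y_train = np.array([expensive_objective(x) for x in pop])
true_evals = len(y_train)

surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
surrogate.fit(X_train, y_train)

for gen in range(1, GENS + 1):
    # Tournament selection on surrogate-predicted fitness (cheap calls only).
    pred = surrogate.predict(pop)
    parents = np.array([
        pop[min(rng.choice(POP, 2, replace=False), key=lambda i: pred[i])]
        for _ in range(POP)
    ])
    # Uniform crossover followed by Gaussian mutation.
    mates = parents[rng.permutation(POP)]
    mask = rng.random((POP, DIM)) < 0.5
    children = np.where(mask, parents, mates) + rng.normal(0.0, 0.3, size=(POP, DIM))
    pop = np.clip(children, -5.0, 5.0)

    # Dynamic retraining: periodically evaluate the most promising individuals
    # on the true objective and refit the surrogate on the enlarged archive.
    if gen % RETRAIN_EVERY == 0:
        best_idx = np.argsort(surrogate.predict(pop))[:5]
        new_y = np.array([expensive_objective(x) for x in pop[best_idx]])
        true_evals += len(new_y)
        X_train = np.vstack([X_train, pop[best_idx]])
        y_train = np.concatenate([y_train, new_y])
        surrogate.fit(X_train, y_train)

best_i = int(np.argmin(y_train))
print(f"true evaluations used: {true_evals}, best archived value: {y_train[best_i]:.4f}")
```

The point of the structure is that the true objective is called only for the initial archive and for a handful of promising offspring at each retraining step, while selection itself runs entirely on the cheap surrogate predictions.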
