Abstract

Polynomial regression (PR) and kriging are standard meta-model techniques used for approximate optimization (AO). Support vector regression (SVR) is a more recent meta-model technique that offers higher accuracy and a lower standard deviation than these existing techniques. In this paper, we propose a sequential approximate optimization (SAO) method based on SVR. An inherited Latin hypercube design (ILHD) is used as the design of experiments (DOE), and a trust region algorithm is used as the model management technique; both are adopted to improve the efficiency of the solution process. We demonstrate the superior accuracy and efficiency of the proposed method by solving three mathematical problems and two engineering design problems, and we compare it with other meta-models such as kriging, radial basis function (RBF), and polynomial regression.
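
As a rough illustration of the meta-modeling step described above (not the authors' implementation), the sketch below fits an SVR surrogate to samples drawn from a Latin hypercube design over a standard 2-D test function. The test function, bounds, sample size, and SVR hyperparameters are assumptions chosen for illustration; scikit-learn's SVR and SciPy's Latin hypercube sampler stand in for the paper's ILHD and trust-region machinery, which are not reproduced here.

```python
# Minimal sketch: SVR meta-model built on a Latin hypercube DOE.
# Assumes scikit-learn and SciPy; all settings below are illustrative only.
import numpy as np
from scipy.stats import qmc
from sklearn.svm import SVR

def branin(x):
    # Branin test function, a common 2-D benchmark (assumed here for illustration).
    x1, x2 = x[:, 0], x[:, 1]
    a, b, c = 1.0, 5.1 / (4 * np.pi**2), 5.0 / np.pi
    r, s, t = 6.0, 10.0, 1.0 / (8 * np.pi)
    return a * (x2 - b * x1**2 + c * x1 - r) ** 2 + s * (1 - t) * np.cos(x1) + s

# Latin hypercube design of experiments over the design space [-5, 10] x [0, 15].
sampler = qmc.LatinHypercube(d=2, seed=0)
X = qmc.scale(sampler.random(n=30), l_bounds=[-5, 0], u_bounds=[10, 15])
y = branin(X)

# Fit the SVR meta-model (RBF kernel); hyperparameters are placeholders,
# not values tuned or reported in the paper.
model = SVR(kernel="rbf", C=100.0, epsilon=0.1, gamma="scale")
model.fit(X, y)

# The surrogate can now stand in for the expensive response inside an optimizer.
print(model.predict(np.array([[np.pi, 2.275]])))  # near a known Branin optimum
```

In a sequential scheme of the kind the abstract describes, the fitted surrogate would be optimized within a trust region, new samples would be added around the predicted optimum, and the meta-model would be refit until convergence.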
