Abstract

This paper proposes a Sequential Model Based Optimization framework for solving optimization problems characterized by a black-box, multi-extremal, expensive and partially defined objective function, under unknown constraints. This is a typical setting in simulation-optimization problems, where the objective function cannot be computed for some configurations of the decision/control variables due to the violation of some (unknown) constraint. The framework is organized in two consecutive phases: the first uses a Support Vector Machine classifier to approximate the boundary of the unknown feasible region within the search space; the second uses Bayesian Optimization to find a globally optimal (feasible) solution. A relevant difference from traditional Bayesian Optimization is that the optimization process is performed only on the estimated feasible region, instead of on the entire search space. Results are reported on three 2D test functions and on a real case study, Pump Scheduling Optimization in Water Distribution Networks. The proposed framework proved to be more effective and efficient than Bayesian Optimization approaches that use a penalty for function evaluations outside the feasible region.
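
The two-phase scheme described in the abstract can be illustrated with a minimal sketch (an illustrative toy, not the authors' implementation): Phase 1 trains an SVM classifier on feasibility labels to approximate the boundary of the feasible region; Phase 2 runs a simple Bayesian optimization loop whose candidate points are restricted to the estimated feasible region. The toy problem (Rosenbrock constrained to a disk), the lower-confidence-bound acquisition, the sample sizes and all hyper-parameters are assumptions chosen for illustration; NumPy and scikit-learn are required.

# Toy sketch of the two consecutive phases (illustrative assumptions throughout).
import numpy as np
from sklearn.svm import SVC
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)
bounds = np.array([[-1.5, 1.5], [-1.5, 1.5]])          # search space (a box)

def objective(x):                                       # treated as black-box
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def is_feasible(x):                                     # unknown constraint, only queried
    return x[0]**2 + x[1]**2 <= 2.0

def sample(n):                                          # uniform points in the box
    return rng.uniform(bounds[:, 0], bounds[:, 1], size=(n, 2))

# Phase 1: estimate the boundary of the feasible region with an SVM classifier.
X1 = sample(40)
y1 = np.array([is_feasible(x) for x in X1], dtype=int)
svm = SVC(kernel="rbf", C=10.0).fit(X1, y1)

# Phase 2: Bayesian optimization, with candidates kept inside the estimated region.
X = np.array([x for x in X1 if is_feasible(x)])
y = np.array([objective(x) for x in X])
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for _ in range(30):
    gp.fit(X, y)
    cand = sample(2000)
    cand = cand[svm.predict(cand) == 1]                 # estimated-feasible candidates only
    mu, sd = gp.predict(cand, return_std=True)
    nxt = cand[np.argmin(mu - 1.96 * sd)]               # lower-confidence-bound acquisition
    if is_feasible(nxt):                                # objective is partially defined:
        X = np.vstack([X, nxt])                         # evaluate only if actually feasible
        y = np.append(y, objective(nxt))

print("best feasible point:", X[np.argmin(y)], "value:", y.min())

A full implementation would typically optimize the acquisition function rather than sample it, and could keep refining the feasibility estimate as new evaluations arrive; the sketch only conveys how the search is restricted to the estimated feasible region.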

Highlights

  • Sequential model based optimization (SMBO), and more precisely Bayesian Optimization (BO) [1], is a global optimization approach which has been shown to be effective and sample efficient in the case of black-box expensive objective functions [2–11]

  • The Support Vector Machine (SVM)-CBO approach was initially validated on a set of simple 2D test functions [48]; this paper presents a wider set of experiments on test functions generated through the Emmental-type GKLS generator [49], with increasing dimensionality and complexity

  • The SVM-CBO approach has been validated on three well-known 2D test functions for constrained global optimization (CGO), namely: Rosenbrock constrained to a disk [55], Rosenbrock constrained to a line and a cubic [55, 56], and the constrained Mishra's Bird function [57] (these problems are sketched in code after this list)
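
As a reference, the following is a hypothetical sketch of the three benchmark problems in their commonly published forms; the exact bounds, strict versus non-strict inequalities and constraint variants used in the paper may differ, so these definitions should be read as assumptions.

# Commonly published forms of the three 2D CGO benchmarks named above.
import numpy as np

def rosenbrock(x, y):
    return (1 - x)**2 + 100 * (y - x**2)**2

def mishra_bird(x, y):
    return (np.sin(y) * np.exp((1 - np.cos(x))**2)
            + np.cos(x) * np.exp((1 - np.sin(y))**2)
            + (x - y)**2)

test_problems = {
    # Rosenbrock constrained to a disk: box [-1.5, 1.5]^2, x^2 + y^2 <= 2
    "rosenbrock_disk": {
        "f": rosenbrock,
        "feasible": lambda x, y: x**2 + y**2 <= 2.0,
        "bounds": [(-1.5, 1.5), (-1.5, 1.5)],
    },
    # Rosenbrock constrained to a line and a cubic:
    # (x - 1)^3 - y + 1 <= 0 and x + y - 2 <= 0
    "rosenbrock_line_cubic": {
        "f": rosenbrock,
        "feasible": lambda x, y: (x - 1)**3 - y + 1 <= 0 and x + y - 2 <= 0,
        "bounds": [(-1.5, 1.5), (-0.5, 2.5)],
    },
    # Constrained Mishra's Bird: points inside the disk of radius 5 centred at (-5, -5)
    "mishra_bird_constrained": {
        "f": mishra_bird,
        "feasible": lambda x, y: (x + 5)**2 + (y + 5)**2 < 25.0,
        "bounds": [(-10.0, 0.0), (-6.5, 0.0)],
    },
}

# Example: both constrained Rosenbrock variants have their optimum at (1, 1), where f = 0
print(rosenbrock(1.0, 1.0), test_problems["rosenbrock_disk"]["feasible"](1.0, 1.0))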



Introduction

Sequential model based optimization (SMBO), and more precisely Bayesian Optimization (BO) [1], is a global optimization approach which has been shown to be effective and sample efficient in the case of black-box expensive objective functions [2–11]. A derivative-free extension of [23], based on DIRECT, has been proposed in [24] and [25], making the approach suitable for globally solving constrained problems where derivatives are not available. Another derivative-free approach, based on Differential Search, has been proposed in [26]; it relies on an exact penalty function and uses a dynamic penalty factor to achieve a better trade-off between exploration and exploitation.

The basic idea of Support Vector Machine (SVM) classification [42, 43] is to search for a hyper-plane that optimally separates instances (represented as vectors) belonging to two different classes while maximizing the distance of every instance from the hyper-plane (i.e., margin maximization). This first formulation is known as hard margin SVM and provides an error-free separation of the instances into the two classes only when the instances are linearly separable.
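
For context, the hard margin SVM mentioned above is commonly written as the following primal problem (standard textbook formulation, not quoted from the paper), where the $x_i$ are the training instances, $y_i \in \{-1, +1\}$ their class labels, and $(w, b)$ the separating hyper-plane:

\[
\min_{w,\,b}\ \tfrac{1}{2}\,\lVert w \rVert^{2}
\quad \text{subject to} \quad
y_i\,\bigl(w^{\top} x_i + b\bigr) \ge 1, \qquad i = 1, \dots, n.
\]

The geometric margin equals $2/\lVert w \rVert$, so minimizing $\lVert w \rVert^{2}$ maximizes the margin; the constraints can all be satisfied only when the two classes are linearly separable, which is why soft margin variants are used in practice.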

