Abstract
Bayesian Global Optimization (BGO), also referred to as Bayesian Optimization or Efficient Global Optimization (EGO), uses statistical models, typically Gaussian process regression, to approximate an expensive objective function. Based on this prediction, an infill criterion is formulated that takes into account the predicted value and variance. BGO adds a new point at the position where this infill criterion attains its optimum. In this chapter, we review different ways to formulate such infill criteria. The focus is on approaches that measure improvement using integrals or statistical moments of a probability distribution over the non-dominated space, including the probability of improvement, the expected hypervolume improvement, and upper quantiles of the hypervolume improvement. These criteria require the solution of non-linear integrals. Besides summarizing progress in the computation of such integrals, we present new, efficient procedures for the high-dimensional expected improvement and probability of improvement. Moreover, the chapter summarizes the main properties of these infill criteria, including continuity and differentiability, as well as monotonicity properties with respect to the variance and mean value. The latter are necessary for constructing global optimization algorithms for non-convex problems.
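To make the two single-objective infill criteria named above concrete, the following is a minimal illustrative sketch (not taken from the chapter) of the standard closed-form expressions for the expected improvement and the probability of improvement for minimization, given the Gaussian process posterior mean and standard deviation at a candidate point; the function names and the small variance guard are my own choices.

```python
import numpy as np
from scipy.stats import norm


def expected_improvement(mu, sigma, f_best):
    """EI(x) = (f_best - mu) * Phi(z) + sigma * phi(z), z = (f_best - mu) / sigma,
    for minimization, where Phi/phi are the standard normal CDF/PDF."""
    sigma = np.maximum(sigma, 1e-12)  # guard against zero predictive variance
    z = (f_best - mu) / sigma
    return (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)


def probability_of_improvement(mu, sigma, f_best):
    """PoI(x) = Phi((f_best - mu) / sigma): probability that the objective
    at x falls below the best observed value f_best."""
    sigma = np.maximum(sigma, 1e-12)
    return norm.cdf((f_best - mu) / sigma)
```

Both criteria depend on the posterior only through its first two moments, which is what makes the monotonicity properties mentioned above (increasing in the variance, decreasing in the predicted mean for minimization) straightforward to verify.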