Abstract

In this chapter, a global optimization problem is considered where both the objective function f(x) and its gradient ∇f(x) are multidimensional black-box functions. It is assumed that ∇f(x) satisfies the Lipschitz condition over the search hyperinterval with an unknown Lipschitz constant K. Different techniques for estimating K are presented, and their advantages and disadvantages are emphasized. With regard to exploring the multidimensional search domain, several adaptive partitioning strategies that can be applied in Lipschitz global optimization methods are discussed: (1) one-point-based algorithms evaluating the objective function and its gradient at one point within each subregion; (2) diagonal partitions, where f(x) and ∇f(x) are evaluated at two points within each subregion; (3) more complex partitions based, for instance, on simplices or on auxiliary functions of various kinds. This chapter deals with diagonal deterministic methods that show a promising performance both in tests and in applications. Several geometric methods based on diagonal partitions and auxiliary functions are presented and compared on eight hundred differentiable problems randomly produced by the GKLS-generator of classes of test functions.
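For illustration only, the following minimal Python sketch (not taken from the chapter) shows how a conservative lower bound for f over a hyperinterval [a, b] can be built from the values of f and ∇f at the two diagonal vertices, once an estimate K of the Lipschitz constant of the gradient is available. The helper names and the overestimation factor r are hypothetical, and the actual geometric diagonal methods discussed in the chapter use sharper characteristics; the sketch only reflects the general idea of combining two-point information with a Lipschitz bound.

import numpy as np

def gradient_lipschitz_estimate(g_a, g_b, a, b, r=1.2):
    # Crude local estimate of the Lipschitz constant K of the gradient from
    # the two diagonal vertices a and b (hypothetical helper; the techniques
    # surveyed in the chapter use more refined local and global estimates).
    dist = np.linalg.norm(b - a)
    return r * np.linalg.norm(g_b - g_a) / dist if dist > 0 else r

def diagonal_lower_bound(f_a, f_b, g_a, g_b, a, b, K):
    # Conservative lower bound for f over the hyperinterval [a, b], built from
    # the inequality
    #     f(y) >= f(x) + <grad f(x), y - x> - (K/2) * ||y - x||^2,
    # which holds when grad f is Lipschitz with constant K.  Each term is
    # minimized independently over the box, so the bound is valid but looser
    # than the auxiliary functions used by the geometric diagonal methods.
    d = b - a
    worst_quad = 0.5 * K * np.dot(d, d)           # max of (K/2)||y - x||^2 over the box
    bound_a = f_a + np.sum(np.minimum(0.0, g_a * d)) - worst_quad   # bound from vertex a
    bound_b = f_b + np.sum(np.minimum(0.0, -g_b * d)) - worst_quad  # bound from vertex b
    return max(bound_a, bound_b)                  # both bounds are valid; keep the tighter one

In a partition-based scheme, such a lower bound would be recomputed for every subregion after each subdivision, and subregions whose bound exceeds the best objective value found so far could be excluded from further exploration.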
