Abstract

Derivative-free optimization focuses on designing methods to solve optimization problems without analytical knowledge of the function. In this paper, we consider the problem of designing derivative-free methods for finite minimax problems: $\min_{x} \max_{i=1,2,\ldots,N} \{ f_i(x) \}$. In order to solve the problem efficiently, we seek to exploit the smooth substructure within the problem. Using ideas developed by Burke et al. [J.V. Burke, A.S. Lewis, and M.L. Overton, Approximating subdifferentials by random sampling of gradients, Math. Oper. Res. 27(3) (2002), pp. 567–584; J.V. Burke, A.S. Lewis, and M.L. Overton, A robust gradient sampling algorithm for nonsmooth, nonconvex optimization, SIAM J. Optim. 15(3) (2005), pp. 751–779], we introduce the idea of a robust simplex gradient descent direction and use it to accelerate convergence. Convergence is proven by showing that the resulting algorithm fits into the directional direct-search framework. Numerical tests demonstrate the algorithm's effectiveness on finite minimax problems.
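
To make the setting concrete, the following is a minimal Python sketch of the finite minimax objective and of a coordinate simplex gradient, the derivative-free surrogate for the gradient on which a simplex-gradient-based descent direction can be built. This is an illustration only, not the paper's robust direction (which, following the gradient sampling ideas of Burke et al., aggregates gradient information from the nearly active component functions); the function names, the step size `h`, and the two component functions are assumptions made for the example.

```python
import numpy as np

# Finite minimax objective F(x) = max_i f_i(x); the two component
# functions below are illustrative placeholders, not from the paper.
fs = [
    lambda x: x[0] ** 2 + x[1] ** 2,
    lambda x: (x[0] - 1.0) ** 2 + 0.5 * x[1] ** 2,
]

def F(x):
    return max(f(x) for f in fs)

def simplex_gradient(func, x0, h=1e-3):
    """Simplex gradient of func at x0 from the n+1 points x0, x0 + h*e_i.

    For this coordinate simplex, the least-squares definition of the
    simplex gradient reduces to forward differences:
        g_i = (func(x0 + h*e_i) - func(x0)) / h.
    """
    n = len(x0)
    f0 = func(x0)
    g = np.empty(n)
    for i in range(n):
        step = np.zeros(n)
        step[i] = h
        g[i] = (func(x0 + step) - f0) / h
    return g

x = np.array([0.4, 0.2])
print(simplex_gradient(F, x))  # descent information without analytic derivatives
```

Note that F is nonsmooth wherever two component functions tie for the maximum, which is why a single simplex gradient can be unreliable near such kinks and a robust, sampling-based direction is attractive.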
