Abstract

We present a new generating set search (GSS) approach for minimizing functions subject to linear constraints. GSS is a class of direct search optimization methods that includes generalized pattern search. One of our main contributions in this paper is a new condition to define the set of conforming search directions that admits several computational advantages. For continuously differentiable functions we also derive a bound relating a measure of stationarity, which is equivalent to the norm of the gradient of the objective in the unconstrained case, to a parameter used by GSS algorithms to control the lengths of the steps. With the additional assumption that the derivative is Lipschitz, we obtain a big-$O$ bound. As a consequence of this relationship, we obtain subsequence convergence to a KKT point, even though GSS algorithms lack explicit gradient information. Numerical results indicate that the bound provides a reasonable estimate of stationarity.
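To convey the flavor of the big-$O$ bound, here is a schematic statement in illustrative notation of our own choosing (the symbols $\chi$, $\Delta_k$, $\Omega$, and the constant $C$ are assumptions for exposition, not necessarily the paper's): writing $\chi(x)$ for a standard stationarity measure over the feasible polyhedron $\Omega$ and $\Delta_k$ for the step-length control parameter, the bound asserts that at unsuccessful iterations $k$,
$$
\chi(x_k) \;=\; \max_{\substack{x_k + w \,\in\, \Omega \\ \|w\| \,\le\, 1}} \; -\nabla f(x_k)^{\mathsf{T}} w \;\le\; C\,\Delta_k ,
$$
where $C$ depends on the Lipschitz constant of $\nabla f$ and on the geometry of the search directions. In the unconstrained case $\chi(x_k) = \|\nabla f(x_k)\|$, and since $\chi$ vanishes exactly at KKT points while the step-length parameter is driven to zero along a subsequence, a bound of this form yields the subsequence convergence described above.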
