We present a sequential quadratic optimization (SQO) algorithm for nonlinear constrained optimization. The method attains all of the strong global and fast local convergence guarantees of classical SQO methods, but has the important additional feature that fast local convergence is guaranteed when the algorithm is employed to solve infeasible instances. A two-phase strategy, carefully constructed parameter updates, and a line search are employed to promote such convergence. The first phase subproblem determines the reduction that can be obtained in a local model of an infeasibility measure when the objective function is ignored. The second phase subproblem then seeks to minimize a local model of the objective while ensuring that the resulting search direction attains a reduction in the local model of the infeasibility measure that is proportional to that attained in the first phase. The subproblem formulations and parameter updates ensure that, near an optimal solution, the algorithm reduces to a classical SQO method.
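To make the two-phase structure concrete, the following is a rough sketch of the kind of subproblems involved, stated in assumed notation rather than drawn from the paper itself: let $c_k$ and $J_k$ denote the constraint values and constraint Jacobian at the iterate $x_k$, let $g_k$ and $H_k$ denote the objective gradient and a Hessian approximation, and measure infeasibility by $\|c(x)\|_1$. A first-phase subproblem of the indicated type could take the form
\[
\bar d_k \in \arg\min_{d}\ \|c_k + J_k d\|_1,
\qquad
\Delta_k := \|c_k\|_1 - \|c_k + J_k \bar d_k\|_1,
\]
where $\Delta_k$ is the reduction attainable in the local infeasibility model when the objective is ignored. A corresponding second-phase subproblem could then be
\[
\min_{d}\ g_k^\top d + \tfrac{1}{2}\, d^\top H_k d
\quad \text{subject to} \quad
\|c_k + J_k d\|_1 \le \|c_k\|_1 - \epsilon\, \Delta_k,
\]
for some fixed $\epsilon \in (0,1]$, so that the resulting search direction reduces the local infeasibility model by an amount proportional to that attained in the first phase.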