Abstract

In the smooth constrained optimization setting, this work introduces the Domain Complementary Approximate Karush–Kuhn–Tucker (DCAKKT) condition, inspired by a sequential optimality condition recently devised for non-smooth constrained optimization problems. It is shown that the augmented Lagrangian method can generate limit points satisfying DCAKKT, and it is proved that this condition is not related to previously established sequential optimality conditions. An essential feature of DCAKKT is that it captures the potential asymptotic growth of the Lagrange multipliers through a single parameter. Moreover, DCAKKT points satisfy the Strong Approximate Gradient Projection (SAGP) condition. Owing to these intrinsic features, which combine strength and generality, this novel and genuine sequential optimality condition may shed light on the practical performance of algorithms yet to be devised.
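For orientation only, the following sketch recalls the classical Approximate KKT (AKKT) template and the Powell–Hestenes–Rockafellar augmented Lagrangian to which the abstract refers; the notation (objective f, inequality constraints g_i, equality constraints h_j, multipliers lambda and mu, penalty parameter rho) is standard but assumed here, and the precise DCAKKT definition is given only in the full paper, not reproduced below.

% Assumed smooth problem: minimize f(x) subject to g_i(x) <= 0 (i = 1,...,m), h_j(x) = 0 (j = 1,...,p).
% Classical AKKT sketch: x* is an AKKT point if there exist x^k -> x*, lambda^k >= 0, mu^k such that
\[
  \nabla f(x^k) + \sum_{i=1}^{m} \lambda_i^k \nabla g_i(x^k)
                + \sum_{j=1}^{p} \mu_j^k \nabla h_j(x^k) \to 0,
  \qquad
  \min\{-g_i(x^k),\, \lambda_i^k\} \to 0 \quad (i = 1,\dots,m).
\]
% Powell--Hestenes--Rockafellar augmented Lagrangian minimized at each outer iteration of the method:
\[
  L_{\rho}(x,\lambda,\mu) \;=\; f(x)
  + \frac{\rho}{2}\left(
      \sum_{i=1}^{m} \max\!\Big\{0,\; g_i(x) + \tfrac{\lambda_i}{\rho}\Big\}^{2}
      + \sum_{j=1}^{p} \Big(h_j(x) + \tfrac{\mu_j}{\rho}\Big)^{2}
    \right).
\]

DCAKKT refines this kind of asymptotic stationarity statement; in particular, the single parameter mentioned in the abstract is meant to control how the multiplier sequences may grow along the iterates.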
