We address the problem of finding a local solution to a nonconvex–nonconcave minmax optimization problem using Newton-type methods, including primal-dual interior-point ones. The first step in our approach is to analyze the local convergence properties of Newton's method in nonconvex minimization. It is well established that Newton's method iterations are attracted to any point with a zero gradient, irrespective of whether it is a local minimum. From a dynamical-systems standpoint, this occurs because every point at which the gradient is zero is a locally asymptotically stable equilibrium point of the Newton iteration. We show that by adding to the Hessian a multiple of the identity large enough to make the resulting matrix positive definite, we can ensure that every equilibrium point that is not a local minimum becomes unstable (meaning that the iterations are no longer attracted to such points), while local minima remain locally asymptotically stable. Building on this foundation, we develop Newton-type algorithms for minmax optimization, conceptualized as a sequence of local quadratic approximations of the minmax problem. Each local quadratic approximation serves as a surrogate that guides the modified Newton's method toward a solution. For these local quadratic approximations to be well defined, the Hessian matrix must be modified by adding a diagonal matrix. We demonstrate that, for an appropriate choice of this diagonal matrix, we can guarantee that every equilibrium point that is not a local minmax becomes unstable, while local minmax points remain locally asymptotically stable. Using numerical examples, we illustrate the importance of guaranteeing this instability property. While our results concern local convergence, the numerical examples also indicate that our algorithm enjoys good global convergence properties.
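To make the first idea concrete, the following minimal sketch (our illustration, not code from the paper) implements a Newton step in which a multiple of the identity is added to the Hessian whenever needed to make it positive definite; the eigenvalue-based shift rule and the parameter `delta` are assumptions made for this example.

```python
import numpy as np

def modified_newton_step(grad, hess, x, delta=1.0):
    """One Newton step with the Hessian shifted by a multiple of the
    identity so that the shifted matrix is positive definite.
    `delta` (a lower bound imposed on the eigenvalues) is an
    illustrative choice, not a value prescribed by the paper."""
    g = grad(x)
    H = hess(x)
    lam_min = np.linalg.eigvalsh(H).min()
    shift = max(0.0, delta - lam_min)  # zero if H is already sufficiently PD
    H_pd = H + shift * np.eye(len(x))
    return x - np.linalg.solve(H_pd, g)

# Example: f(x) = x0^2 - x1^2 + x1^4 has a saddle at the origin and
# local minima at (0, +/- 1/sqrt(2)).
grad = lambda x: np.array([2.0 * x[0], -2.0 * x[1] + 4.0 * x[1] ** 3])
hess = lambda x: np.diag([2.0, -2.0 + 12.0 * x[1] ** 2])

x = np.array([0.5, 0.1])  # starts near the saddle
for _ in range(50):
    x = modified_newton_step(grad, hess, x)
print(x)  # approaches (0, 0.707...), not the saddle
```

With a plain Newton step (zero shift everywhere), the same initialization is attracted to the saddle at the origin; the positive-definite shift is what destabilizes it. In the minmax setting the analogous role is played by the added diagonal matrix that makes the local quadratic approximations well defined.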