Abstract

Nonlinear rescaling (NR) methods alternate finding an unconstrained minimizer of the Lagrangian for the equivalent problem in the primal space (an infinite procedure) with a Lagrange multiplier update.

We introduce and study a proximal point nonlinear rescaling (PPNR) method that preserves the convergence and retains the linear convergence rate of the original NR method, while not requiring an infinite procedure at each step.

The critical component of our analysis is the equivalence of the NR method with dynamic scaling parameter update to an interior quadratic proximal point method for the dual problem in a dual space that is rescaled from step to step.

By adding the classical quadratic proximal term to the primal objective function, the PPNR step can be viewed as a primal-dual proximal point mapping. This allows a wide variety of non-quadratic augmented Lagrangian methods to be analyzed from a single, general point of view using tools typical of the classical quadratic proximal point technique.

We prove convergence of the primal-dual PPNR sequence under minimal assumptions on the input data and establish a $q$-linear rate of convergence under the standard second-order optimality conditions.
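One PPNR step can be sketched as follows. This is a hedged illustration only: the transform $\psi$, the scaling parameter $k$, and the unit proximal weight are standard choices in the NR literature and are assumed here, not specified in the abstract itself.

```latex
% Constrained problem: \min f(x) subject to c_i(x) \ge 0, i = 1,\dots,m.
% Assumed NR transform \psi: concave, increasing, \psi(0)=0, \psi'(0)=1
% (e.g. \psi(t) = \ln(1+t)), with scaling parameter k > 0.
% NR Lagrangian for the equivalent (rescaled) problem:
%   \mathcal{L}_k(x,\lambda) = f(x) - k^{-1}\sum_{i=1}^m \lambda_i \psi(k\,c_i(x)).
% A PPNR step from (x^s, \lambda^s): minimize the NR Lagrangian plus the
% classical quadratic proximal term, then update the multipliers.
\begin{align*}
x^{s+1} &\approx \arg\min_x \Big\{\, \mathcal{L}_k(x,\lambda^s)
           + \tfrac{1}{2}\|x - x^s\|^2 \,\Big\}, \\
\lambda_i^{s+1} &= \lambda_i^s\, \psi'\!\big(k\, c_i(x^{s+1})\big),
           \quad i = 1,\dots,m.
\end{align*}
```

Because the proximal term makes the subproblem strongly convex, the minimization need only be carried out approximately, which is what removes the infinite procedure from each step.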
