The accelerated proximal gradient (APG) method, first proposed by Nesterov for minimizing smooth convex functions, later extended by Beck and Teboulle to composite convex objective functions, and studied in a unifying manner by Tseng, has proven to be highly efficient in solving several classes of large-scale structured (possibly nonsmooth) convex optimization problems, including nuclear norm minimization problems in matrix completion and $\ell_1$ minimization problems in compressed sensing. The method has a worst-case iteration complexity superior to that of the classical projected gradient method and usually performs well in practice on problems with appropriate structure. In this paper, we extend the APG method to the inexact setting, where the subproblem in each iteration is solved only approximately, and show that it enjoys the same worst-case iteration complexity as its exact counterpart provided the subproblems are solved to progressively higher accuracy. We apply our inexact APG method to solve large-scale convex quadratic semidefinite programming (QSDP) problems of the form $\min\{ \frac{1}{2}\langle x,\,\mathcal{Q}(x)\rangle + \langle c,\,x\rangle \mid \mathcal{A}(x) = b,\ x\succeq 0\}$, where $\mathcal{Q},\mathcal{A}$ are given linear maps and $b,c$ are given data. The subproblem in each iteration is solved by a semismooth Newton-CG (SSNCG) method, warm-started with the iterate from the previous iteration. Our APG-SSNCG method is demonstrated to be efficient for QSDP problems whose positive semidefinite linear maps $\mathcal{Q}$ are highly ill-conditioned or rank deficient.
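To make the scheme concrete, below is a minimal Python sketch of a FISTA-style inexact APG loop, not the paper's implementation: the helper names `grad_f` and `prox_g_approx` and the tolerance schedule `tol_k = 1/k^2` are illustrative assumptions, and the precise accuracy conditions under which the $O(1/k^2)$ rate is preserved are those stated in the paper.

```python
import numpy as np

def inexact_apg(grad_f, prox_g_approx, x0, L, n_iter=100):
    """FISTA-style accelerated proximal gradient for min f(x) + g(x),
    where the proximal subproblem is solved only approximately, to an
    inner tolerance that tightens as the outer iteration proceeds."""
    x_prev = x0.copy()
    y = x0.copy()
    t = 1.0
    for k in range(1, n_iter + 1):
        # Gradient step at the extrapolated point y_k.
        z = y - grad_f(y) / L
        # Approximate proximal step: an illustrative tolerance schedule;
        # the schedule actually required for the O(1/k^2) rate is the
        # one specified in the paper's analysis.
        tol_k = 1.0 / k**2
        x = prox_g_approx(z, 1.0 / L, tol_k)
        # Nesterov extrapolation.
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x + ((t - 1.0) / t_next) * (x - x_prev)
        x_prev, t = x, t_next
    return x_prev

if __name__ == "__main__":
    # Toy l1-regularized least squares: min 0.5*||Ax - b||^2 + lam*||x||_1.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 100))
    b = rng.standard_normal(40)
    lam = 0.1
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the gradient
    grad_f = lambda x: A.T @ (A @ x - b)
    # Soft-thresholding solves the l1 prox exactly, hence within any tol;
    # it stands in here only to keep the toy example runnable.
    prox = lambda z, s, tol: np.sign(z) * np.maximum(np.abs(z) - s * lam, 0.0)
    x = inexact_apg(grad_f, prox, np.zeros(100), L)
```

In the QSDP application the proximal step is itself a constrained subproblem with no closed-form solution, which is where the SSNCG inner solver and the warm start from the previous iterate come in.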