Abstract

Error bound analysis, which estimates the distance of a point to the solution set of an optimization problem using the optimality residual, is a powerful tool for the analysis of first-order optimization algorithms. In this paper, we use global error bound analysis to study the iteration complexity of a first-order algorithm for a linearly constrained nonconvex minimization problem. We develop a global dual error bound analysis for a regularized version of this nonconvex problem by using a novel “decomposition” technique. Equipped with this global dual error bound, we prove that a suitably designed primal-dual first-order method can generate an $\epsilon$-stationary solution of the linearly constrained nonconvex minimization problem within $\mathcal{O}(1/\epsilon^2)$ iterations, improving the best known complexity of $\mathcal{O}(1/\epsilon^3)$ for this class of nonconvex problems. Furthermore, when the objective function is quadratic, we establish the linear convergence of the algorithm. Our proof is based on a new potential function and a novel use of error bounds.
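The abstract does not spell out the algorithm, so the following is only a generic illustration of the primal-dual template it refers to: a linearized augmented-Lagrangian iteration for $\min f(x)$ subject to $Ax = b$, where an $\epsilon$-stationary point is one with small KKT residual (dual stationarity plus feasibility). The step sizes, the function `primal_dual`, and the toy problem are all assumptions made for this sketch, not the paper's method or constants.

```python
# Illustrative sketch (not the paper's exact method): a linearized
# augmented-Lagrangian primal-dual iteration for min f(x) s.t. Ax = b.
# Step sizes alpha, rho and the test instance below are assumptions.
import numpy as np

def primal_dual(grad_f, A, b, x0, alpha=0.1, rho=1.0, iters=2000):
    """One gradient step on the augmented Lagrangian in x, then dual ascent in y."""
    x, y = x0.copy(), np.zeros(A.shape[0])
    for _ in range(iters):
        # Primal: gradient of L_rho(x, y) = f(x) + y^T(Ax - b) + (rho/2)||Ax - b||^2
        g = grad_f(x) + A.T @ y + rho * (A.T @ (A @ x - b))
        x = x - alpha * g
        # Dual: ascent step on the constraint residual
        y = y + rho * (A @ x - b)
    return x, y

# Toy strongly convex instance: f(x) = 0.5||x - c||^2, constraint sum(x) = 1.
c = np.array([1.0, 2.0, 3.0])
A = np.ones((1, 3))
b = np.array([1.0])
x, y = primal_dual(lambda x: x - c, A, b, x0=np.zeros(3))

# KKT residual: both quantities should be small at an (eps-)stationary point.
feasibility = abs(A @ x - b)[0]                    # ||Ax - b||
stationarity = np.linalg.norm((x - c) + A.T @ y)   # ||grad f(x) + A^T y||
```

On this strongly convex instance the iteration contracts linearly, which mirrors the abstract's linear-convergence claim for quadratic objectives; for nonconvex $f$ only $\epsilon$-stationarity within $\mathcal{O}(1/\epsilon^2)$ iterations is asserted.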
