Abstract

Consider the problem of minimizing, over a polyhedral set, the composition of an affine mapping with a strictly convex essentially smooth function. A general result on the linear convergence of descent methods for solving this problem is presented. By applying this result, the linear convergence of both the gradient projection algorithm of Goldstein and Levitin and Polyak, and a matrix splitting algorithm using regular splitting, is established. The results do not require that the cost function be strongly convex or that the optimal solution set be bounded. The key to the analysis lies in a new error bound for estimating the distance from a feasible point to the optimal solution set.
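
As a minimal sketch of the gradient projection iteration referred to above (not the paper's exact construction), one can take the strictly convex essentially smooth function to be g(y) = 0.5*||y - b||^2, the affine mapping to be y = Ex, and the polyhedral set to be a simple box; the function and variable names below (gradient_projection, E, b, l, u, step) are illustrative assumptions only.

```python
import numpy as np

def gradient_projection(E, b, l, u, x0, step, max_iter=1000, tol=1e-8):
    """Constant-step gradient projection for min 0.5*||Ex - b||^2 over the box l <= x <= u.

    f(x) = g(Ex) with g strictly convex, but f itself need not be strongly
    convex in x when E has a nontrivial null space -- the regime the paper's
    error bound is designed to handle.
    """
    x = x0.copy()
    for _ in range(max_iter):
        grad = E.T @ (E @ x - b)                # gradient of f at x
        x_new = np.clip(x - step * grad, l, u)  # Euclidean projection onto the box
        if np.linalg.norm(x_new - x) <= tol:    # stop when the iterate is (nearly) fixed
            return x_new
        x = x_new
    return x

# Usage on a small random instance (hypothetical data)
rng = np.random.default_rng(0)
E = rng.standard_normal((3, 5))            # wide E: f is convex but not strongly convex
b = rng.standard_normal(3)
l, u = -np.ones(5), np.ones(5)
step = 1.0 / np.linalg.norm(E, 2) ** 2     # step below 2/L, with L = ||E||_2^2
x_star = gradient_projection(E, b, l, u, np.zeros(5), step)
```

The constant step size keeps the sketch short; the linear convergence result in the paper covers descent methods of this type under conditions on the step sizes, without requiring strong convexity of f or boundedness of the optimal solution set.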
