Abstract

The SIGEST paper in this issue, “Optimization in High Dimensions via Accelerated, Parallel, and Proximal Coordinate Descent,” by Olivier Fercoq and Peter Richtárik, is an expanded version of their paper “Accelerated, Parallel, and Proximal Coordinate Descent” from the SIAM Journal on Optimization. Coordinate descent methods, which search for a lower value of the objective function by varying one or a small number of variables at a time, are useful for problems so large that operations on full-sized vectors are impractical. If the vector of design variables will not fit in main memory, for example, then a full vector operation would require disk access. The paper considers a special class of convex optimization problems whose structure favors coordinate descent methods. The authors exploit this structure with a method they call APPROX (accelerated, parallel, and proximal). The structure of the problem facilitates parallelism. The method is accelerated in the sense that the convergence rate is proportional to the reciprocal of the square of the iteration number rather than to the reciprocal of the iteration number itself. Proximal methods penalize large differences between successive iterates. In this expanded version the authors have added a section on applications of the results. This new material appeared after the original paper and illustrates the utility of the theory very well. Interestingly, the section on applications is not at the end, but is a new second section that appears before the description of the algorithm. The advantage of this ordering is that the reader gets an early view of how the problem class maps to applications, making it easier to appreciate the paper.
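
To make the basic mechanics concrete, the sketch below shows plain randomized proximal coordinate descent applied to a lasso-type problem. It is a minimal illustration only, not the authors' APPROX method, which additionally incorporates acceleration and parallel block updates; the problem data (matrix A, vector b, regularization weight lam) and all function names are hypothetical.

    import numpy as np

    # Illustrative randomized proximal coordinate descent for
    #   min_x 0.5*||A x - b||^2 + lam*||x||_1
    # Each step updates one coordinate using its partial derivative and the
    # soft-thresholding (proximal) operator of the l1 term.

    def soft_threshold(z, t):
        """Proximal operator of t*|.| (soft-thresholding)."""
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def prox_coordinate_descent(A, b, lam, n_iters=5000, seed=0):
        rng = np.random.default_rng(seed)
        m, n = A.shape
        x = np.zeros(n)
        residual = A @ x - b                 # maintained incrementally; never recomputed in full
        L = (A ** 2).sum(axis=0)             # coordinate-wise Lipschitz constants
        for _ in range(n_iters):
            i = rng.integers(n)              # pick a random coordinate
            grad_i = A[:, i] @ residual      # partial derivative with respect to x_i
            x_new_i = soft_threshold(x[i] - grad_i / L[i], lam / L[i])
            residual += A[:, i] * (x_new_i - x[i])   # cheap single-column residual update
            x[i] = x_new_i
        return x

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        A = rng.standard_normal((200, 50))
        x_true = np.zeros(50)
        x_true[:5] = 1.0
        b = A @ x_true + 0.01 * rng.standard_normal(200)
        x_hat = prox_coordinate_descent(A, b, lam=0.1)
        print("recovered support:", np.flatnonzero(np.abs(x_hat) > 1e-3))

Note how each iteration touches only a single column of A and a single entry of x; this per-coordinate cost is what makes the approach attractive when full-vector operations are too expensive.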
