Abstract

Convex nonsmooth optimization problems whose solutions live in very high-dimensional spaces have become ubiquitous. To solve them, the class of first-order algorithms known as proximal splitting algorithms is particularly well suited: they consist of simple operations that handle the terms in the objective function separately. In this overview, we demystify a selection of recent proximal splitting algorithms: we present them within a unified framework, which consists of applying splitting methods for monotone inclusions in primal-dual product spaces, with well-chosen metrics. Along the way, we easily derive new variants of the algorithms and revisit existing convergence results, extending the parameter ranges in several cases. In particular, we emphasize that when the smooth term in the objective function is quadratic, e.g., for least-squares problems, convergence is guaranteed with larger values of the relaxation parameter than previously known. Such larger values are usually beneficial for the convergence speed in practice.
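To make the setting concrete, here is a minimal sketch (not one of the paper's specific schemes) of a relaxed forward-backward proximal splitting iteration applied to a least-squares problem with an l1 penalty, written in Python with NumPy. The function names, the test problem, and the parameter choices are illustrative assumptions; the step size gamma and the relaxation parameter rho are the kinds of tunable quantities whose admissible ranges the paper studies.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximity operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def relaxed_proximal_gradient(A, b, lam, gamma, rho, n_iter=500):
    """Relaxed forward-backward splitting for the lasso problem
        min_x 0.5 * ||A x - b||^2 + lam * ||x||_1.
    gamma: step size for the gradient (forward) step.
    rho:   relaxation parameter applied to each update.
    """
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                            # gradient of the smooth (quadratic) term
        x_half = soft_threshold(x - gamma * grad, gamma * lam)  # proximal (backward) step
        x = x + rho * (x_half - x)                          # relaxed update
    return x

# Example usage on a random least-squares problem (illustrative data).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
b = rng.standard_normal(50)
L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the gradient of the quadratic term
# rho > 1 is over-relaxation; the abstract notes that when the smooth term is
# quadratic, convergence holds for larger relaxation values than classical theory allows.
x_hat = relaxed_proximal_gradient(A, b, lam=0.1, gamma=1.0 / L, rho=1.4)
```

Each iteration touches the two terms of the objective separately, via a gradient step on the smooth quadratic term and a proximity operator on the nonsmooth term, which is the hallmark of proximal splitting described in the abstract.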
