Abstract

We analyze several generic proximal splitting algorithms well suited for large-scale convex nonsmooth optimization. We derive sublinear and linear convergence results with new rates on the function value suboptimality or distance to the solution, as well as new accelerated versions, using varying stepsizes. In addition, we propose distributed variants of these algorithms, which can be accelerated as well. While most existing results are ergodic, our nonergodic results significantly broaden our understanding of primal–dual optimization algorithms.

Highlights

  • We propose new algorithms for the generic convex optimization problem

    minimize_{x ∈ X}  R(x) + (1/M) Σ_{m=1}^{M} ( F_m(x) + H_m(K_m x) ),    (1)

    where M ≥ 1 is typically the number of parallel computing nodes in a distributed setting; the K_m: X → U_m are linear operators; X and the U_m are real Hilbert spaces; R and the H_m are proper, closed, convex functions with values in ℝ ∪ {+∞} whose proximity operators are easy to compute; and the F_m are convex L_{F_m}-smooth functions, that is, ∇F_m is L_{F_m}-Lipschitz continuous for some L_{F_m} > 0. This template problem covers most convex optimization problems met in signal and image processing, operations research, control, machine learning, and many other fields. Our goal is to propose new generic distributed algorithms able to handle nonsmooth functions through their proximity operators, with acceleration in the presence of strong convexity. A runnable sketch of the PD3O iteration, one of the algorithms analyzed in the paper, is given after this list.

  • Our contributions are the following: (1) New algorithms: We propose the first distributed algorithms to solve (1) in full generality, with proven convergence to an exact solution, and having the full splitting, or decoupling, property: ∇F_m, prox_{H_m}, K_m and K_m* are applied at the m-th node, and the proximity operator of R is applied at the master node (an illustrative consensus reformulation is sketched after this list).
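To make the template concrete, below is a minimal NumPy sketch of one common statement of the PD3O iteration for the single-node case M = 1, i.e., minimize F(x) + R(x) + H(Kx). The toy instance (data A, b; penalties lam, mu; stepsize choices) is an illustrative assumption, not taken from the paper.

```python
# A minimal sketch of one common statement of the PD3O iteration for M = 1:
#   minimize_x  F(x) + R(x) + H(Kx)
# Illustrative toy instance (an assumption, not from the paper):
#   F(x) = 0.5*||A x - b||^2,  R = lam*||.||_1,  H = mu*||.||_1,  K = finite differences.
import numpy as np

rng = np.random.default_rng(0)
n, p = 40, 60
A = rng.standard_normal((n, p))
b = rng.standard_normal(n)
lam, mu = 0.1, 0.1

# K x = (x[i+1] - x[i])_i : first-order finite differences, shape (p-1, p)
K = np.eye(p, k=1)[:-1] - np.eye(p)[:-1]

grad_F = lambda x: A.T @ (A @ x - b)                      # ∇F; L_F = ||A||^2
prox_R = lambda z, g: np.sign(z) * np.maximum(np.abs(z) - g * lam, 0.0)  # soft-threshold
prox_Hstar = lambda u: np.clip(u, -mu, mu)                # prox of sigma*H*: l_inf-ball projection

L_F = np.linalg.norm(A, 2) ** 2
gamma = 1.0 / L_F                                         # requires gamma < 2/L_F
sigma = 1.0 / (gamma * np.linalg.norm(K, 2) ** 2)         # requires gamma*sigma*||K||^2 <= 1

x, u = np.zeros(p), np.zeros(p - 1)
g_old = grad_F(x)
for _ in range(2000):
    x_new = prox_R(x - gamma * g_old - gamma * K.T @ u, gamma)
    g_new = grad_F(x_new)
    # the gradient-corrected extrapolation below is what distinguishes PD3O
    # from the Condat–Vũ algorithm, which uses 2*x_new - x only
    u = prox_Hstar(u + sigma * K @ (2 * x_new - x + gamma * (g_old - g_new)))
    x, g_old = x_new, g_new

print("objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2
      + lam * np.abs(x).sum() + mu * np.abs(K @ x).sum())
```

With F = 0 this recursion reduces to the Chambolle–Pock algorithm, and with H = 0 it reduces to proximal gradient descent, which is a quick sanity check on the sketch.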
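The full-splitting property rests on rewriting (1) over the product space X^M. The display below is a standard consensus lifting, given as an illustrative reformulation in our own notation; the paper's exact lifted problem may attach R differently.

```latex
% Illustrative consensus lifting of problem (1) over the product space X^M:
\min_{x_1,\dots,x_M \in X} \; R(\bar{x})
  + \frac{1}{M}\sum_{m=1}^{M}\bigl( F_m(x_m) + H_m(K_m x_m) \bigr)
\quad \text{s.t.} \quad x_1 = \cdots = x_M,
\qquad \bar{x} = \frac{1}{M}\sum_{m=1}^{M} x_m .
```

Under the consensus constraint, node m touches only F_m, H_m, K_m, and K_m*, while enforcing consensus and applying prox_R amounts to an averaging step at the master, which is the decoupling described in the contribution above.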


Summary

INTRODUCTION

We propose new algorithms for the generic convex optimization problem (1) above, where M ≥ 1 is typically the number of parallel computing nodes in a distributed setting; the K_m: X → U_m are linear operators; X and the U_m are real Hilbert spaces (all spaces are assumed to be finite-dimensional); R and the H_m are proper, closed, convex functions with values in ℝ ∪ {+∞} whose proximity operators are easy to compute; and the F_m are convex L_{F_m}-smooth functions, that is, ∇F_m is L_{F_m}-Lipschitz continuous for some L_{F_m} > 0. This template problem covers most convex optimization problems met in signal and image processing, operations research, control, machine learning, and many other fields. Our goal is to propose new generic distributed algorithms able to handle nonsmooth functions through their proximity operators, with acceleration in the presence of strong convexity.
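The derivation sections listed below obtain the PD3O and PDDY algorithms from the Davis–Yin splitting. As a reference point, here is a minimal NumPy sketch of the standard Davis–Yin iteration for the case without a linear operator, minimize F(x) + R(x) + H(x); the toy nonnegative-LASSO instance and the stepsize choice are illustrative assumptions, not taken from the paper.

```python
# A minimal sketch of the Davis–Yin three-operator splitting for
#   minimize_x  F(x) + R(x) + H(x)
# Illustrative nonnegative-LASSO instance (an assumption, not from the paper):
#   F(x) = 0.5*||A x - b||^2,  R(x) = lam*||x||_1,  H = indicator of {x >= 0}.
import numpy as np

rng = np.random.default_rng(1)
n, p = 50, 80
A = rng.standard_normal((n, p))
b = rng.standard_normal(n)
lam = 0.1

L_F = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of ∇F
gamma = 1.0 / L_F                      # Davis–Yin requires gamma < 2/L_F

grad_F = lambda x: A.T @ (A @ x - b)                              # ∇F
prox_H = lambda z: np.maximum(z, 0.0)                             # projection onto {x >= 0}
prox_R = lambda z: np.sign(z) * np.maximum(np.abs(z) - gamma * lam, 0.0)  # soft-threshold

z = np.zeros(p)
for _ in range(3000):
    x = prox_H(z)                                   # first proximity operator
    y = prox_R(2 * x - z - gamma * grad_F(x))       # second prox, with a gradient step inside
    z = z + y - x                                   # update of the governing sequence
x = prox_H(z)                                       # x converges to a solution

print("objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum())
```

Loosely speaking, PD3O and PDDY arise by applying this scheme in a primal–dual product space, so that the composite term H(Kx) is handled through prox_{H*}; the two algorithms differ in the order in which the operators enter the splitting.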

Contributions
Related Work
Organization of the paper
MINIMIZATION OF 3 FUNCTIONS WITH A LINEAR OPERATOR
Deriving the Nonstationary PD3O and PDDY Algorithms
Convergence Analysis
DISTRIBUTED PROXIMAL ALGORITHMS
Image Deblurring Regularized With Total Variation
Image Deblurring Regularized With
SVM With Hinge Loss
DERIVATION OF THE ALGORITHMS
The Davis–Yin Algorithm
The PD3O Algorithm
The PDDY Algorithm
The Distributed PD3O Algorithm and its Particular Cases
The Distributed PDDY Algorithm
The Distributed Condat–Vũ Algorithm
Findings
DATA AVAILABILITY STATEMENT