Abstract

In this paper, we propose a new primal–dual algorithm for minimizing $f(\mathbf{x}) + g(\mathbf{x}) + h(\mathbf{A}\mathbf{x})$, where $f$, $g$, and $h$ are proper lower semicontinuous convex functions, $f$ is differentiable with a Lipschitz continuous gradient, and $\mathbf{A}$ is a bounded linear operator. The proposed algorithm includes several well-known primal–dual algorithms for minimizing the sum of two functions as special cases: it reduces to the Chambolle–Pock algorithm when $f = 0$ and to the proximal alternating predictor–corrector when $g = 0$. For the general convex case, we prove convergence of the new algorithm in terms of the distance to a fixed point by showing that the iteration is a nonexpansive operator. In addition, we prove an $O(1/k)$ ergodic convergence rate in the primal–dual gap. Under additional assumptions, we derive a linear convergence rate in terms of the distance to the fixed point. Compared with other primal–dual algorithms for the same problem, this algorithm admits a wider range of parameters that guarantee convergence and has a lower per-iteration cost. Numerical experiments demonstrate the efficiency of the algorithm.
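The abstract states that the method reduces to the Chambolle–Pock algorithm when $f = 0$, i.e., to a primal–dual scheme for $\min_{\mathbf{x}} g(\mathbf{x}) + h(\mathbf{A}\mathbf{x})$. As a point of reference for that special case only (the paper's full three-function iteration is not given in the abstract), below is a minimal NumPy sketch of Chambolle–Pock. The instance it solves, $g = \tfrac{1}{2}\|\mathbf{x} - \mathbf{b}\|^2$ and $h = \mu\|\cdot\|_1$, and the step-size choice $\tau = \sigma = 0.99/\|\mathbf{A}\|$ are illustrative assumptions, not taken from the paper; the dual step uses the fact that the proximal operator of the conjugate of $\mu\|\cdot\|_1$ is the projection onto the $\ell_\infty$ ball of radius $\mu$.

```python
import numpy as np

def prox_g(v, tau, b):
    # Proximal operator of g(x) = 0.5 * ||x - b||^2:
    # argmin_x 0.5*||x - b||^2 + (1/(2*tau))*||x - v||^2 = (v + tau*b) / (1 + tau)
    return (v + tau * b) / (1.0 + tau)

def prox_h_conj(y, mu):
    # h = mu*||.||_1, so h* is the indicator of the l-infinity ball of radius mu;
    # its prox (for any step size) is the projection onto that ball.
    return np.clip(y, -mu, mu)

def chambolle_pock(A, b, mu, n_iter=500):
    """Chambolle-Pock sketch for min_x 0.5*||x - b||^2 + mu*||A x||_1 (the f = 0 case).

    Illustrative example only; g, h, and the step sizes are assumptions,
    not the paper's general three-function algorithm.
    """
    m, n = A.shape
    L = np.linalg.norm(A, 2)        # operator norm ||A||
    tau = sigma = 0.99 / L          # ensures tau * sigma * ||A||^2 < 1
    x = np.zeros(n)
    y = np.zeros(m)
    x_bar = x.copy()
    for _ in range(n_iter):
        # Dual ascent step on the extrapolated primal point
        y = prox_h_conj(y + sigma * (A @ x_bar), mu)
        # Primal descent step
        x_new = prox_g(x - tau * (A.T @ y), tau, b)
        # Extrapolation with theta = 1
        x_bar = 2.0 * x_new - x
        x = x_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((30, 50))
    b = rng.standard_normal(50)
    x = chambolle_pock(A, b, mu=0.1)
    print("objective:", 0.5 * np.sum((x - b) ** 2) + 0.1 * np.sum(np.abs(A @ x)))
```

The extrapolated point $\bar{\mathbf{x}}^{k+1} = 2\mathbf{x}^{k+1} - \mathbf{x}^k$ is what makes the iteration a fixed-point scheme of a nonexpansive operator, which is the same style of argument the abstract invokes for the convergence proof of the general algorithm.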
