Abstract

<p style='text-indent:20px;'>The primal-dual hybrid gradient (PDHG) method and the primal-dual algorithm proposed by Chambolle and Pock are both efficient methods for solving saddle point problems. However, the convergence of both methods depends on assumptions that can be too restrictive or impractical in real applications. In this paper, we propose a new parameter condition for the PDHG method. This improvement requires only the primal or the dual objective function to be strongly convex. The relaxed parameter condition leads to accelerated convergence. Although a counterexample shows that the PDHG method does not necessarily converge with constant step sizes, it becomes convergent under our relaxed parameter condition. Preliminary experimental results show that the PDHG method with our relaxed parameter condition is more efficient than several state-of-the-art methods.</p>
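To make the setting concrete, the following is a minimal sketch of the classical primal-dual iteration on a toy strongly convex saddle point problem (regularized least squares). The step-size scaling used here is the classical condition τσ‖K‖² &lt; 1, not the relaxed condition proposed in the paper; the problem instance, parameter values, and the `theta` extrapolation switch (θ = 0 for PDHG, θ = 1 for the Chambolle–Pock variant) are illustrative assumptions.

```python
import numpy as np

def pdhg(K, b, lam=0.1, theta=1.0, iters=5000):
    """Sketch of a primal-dual hybrid gradient iteration for the toy
    saddle point problem
        min_x max_y <Kx, y> + (lam/2)||x||^2 - (1/2)||y||^2 - <b, y>,
    whose primal form is min_x (1/2)||Kx - b||^2 + (lam/2)||x||^2.
    theta = 0 gives the plain PDHG update, theta = 1 the
    Chambolle-Pock extrapolated variant.  Step sizes follow the
    classical condition tau * sigma * ||K||^2 < 1, not the relaxed
    condition of the paper."""
    m, n = K.shape
    x = np.zeros(n)
    x_bar = x.copy()
    y = np.zeros(m)
    # Classical step-size choice: tau = sigma = 0.9 / ||K||,
    # so tau * sigma * ||K||^2 = 0.81 < 1.
    L = np.linalg.norm(K, 2)
    tau = sigma = 0.9 / L
    for _ in range(iters):
        # Dual ascent step: prox of f*(y) = (1/2)||y||^2 + <b, y>
        y = (y + sigma * (K @ x_bar - b)) / (1.0 + sigma)
        # Primal descent step: prox of g(x) = (lam/2)||x||^2
        x_new = (x - tau * (K.T @ y)) / (1.0 + tau * lam)
        # Extrapolation of the primal variable
        x_bar = x_new + theta * (x_new - x)
        x = x_new
    return x
```

For this instance the minimizer has the closed form x* = (KᵀK + λI)⁻¹Kᵀb, which makes the sketch easy to check against a direct solve.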
