Abstract

In this paper, we propose generalized splitting methods for solving a class of nonconvex optimization problems. The new methods extend the classic Douglas–Rachford and Peaceman–Rachford splitting methods. For some special cases, the admissible range of step-sizes can even be doubled. The new methods can also be applied to convex optimization problems. In particular, for convex problems we propose more relaxed conditions on the step-sizes and other parameters, and we prove global convergence and iteration complexity without any additional assumptions. Under a strong convexity assumption on the objective function, a linear convergence rate can be derived easily.
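For reference, the classic iterations being generalized can be written as follows for minimizing f(x) + g(x); the notation here (step-size \gamma > 0, relaxation factor \alpha, and the proximal operator \operatorname{prox}) follows common usage in the splitting literature and is not taken from the paper itself:

\[
\begin{aligned}
x^{k+1} &= \operatorname{prox}_{\gamma f}\big(z^{k}\big),\\
y^{k+1} &= \operatorname{prox}_{\gamma g}\big(2x^{k+1} - z^{k}\big),\\
z^{k+1} &= z^{k} + \alpha\big(y^{k+1} - x^{k+1}\big),
\end{aligned}
\qquad
\operatorname{prox}_{\gamma f}(z) := \arg\min_{x}\Big\{ f(x) + \tfrac{1}{2\gamma}\|x - z\|^{2} \Big\}.
\]

Setting \alpha = 1 recovers the Douglas–Rachford iteration and \alpha = 2 the Peaceman–Rachford iteration; the generalized methods of the paper allow wider choices of these parameters.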
