In this paper, we study the convergence of algorithms for a class of nonconvex and nonsmooth optimization problems via a level-set subdifferential error bound. Many algorithms (e.g., the forward-backward splitting algorithm and the alternating proximal minimization algorithm) satisfy sufficient descent and relative error conditions. We also show that these algorithms satisfy a quadratic error condition. Under these conditions and the level-set subdifferential error bound, we establish the global convergence and convergence rates of these algorithms without invoking the Kurdyka-Łojasiewicz property, which is stronger than the level-set subdifferential error bound. Two examples are given to illustrate the broad applicability of our results to these algorithms.
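As an illustration of the forward-backward splitting scheme mentioned above, the following is a minimal Python sketch applied to a toy l1-regularized least-squares problem. The problem data, function names, and step-size choice are illustrative assumptions and are not taken from the paper; the iteration itself is the standard forward-backward step x_{k+1} = prox_{t g}(x_k - t grad f(x_k)), which under standard assumptions satisfies the sufficient descent and relative error conditions discussed in the abstract.

    import numpy as np

    # Toy problem: min_x f(x) + g(x) with
    #   f(x) = 0.5*||Ax - b||^2  (smooth part)
    #   g(x) = lam*||x||_1       (nonsmooth part)
    # All data below are illustrative assumptions, not from the paper.

    def soft_threshold(z, tau):
        # Proximal operator of tau*||.||_1 (soft-thresholding).
        return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

    def forward_backward(A, b, lam, steps=500):
        # Step size t = 1/L, where L = ||A||^2 is the Lipschitz
        # constant of grad f; a common (assumed) choice.
        t = 1.0 / np.linalg.norm(A, 2) ** 2
        x = np.zeros(A.shape[1])
        for _ in range(steps):
            grad = A.T @ (A @ x - b)                   # forward (gradient) step on f
            x = soft_threshold(x - t * grad, t * lam)  # backward (proximal) step on g
        return x

    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 50))
    x_true = np.zeros(50)
    x_true[:3] = [1.0, -2.0, 0.5]
    b = A @ x_true
    print(forward_backward(A, b, lam=0.1)[:5])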