The inertial proximal method is extended to minimize the sum of finitely many separable, nonconvex, and possibly nonsmooth objective functions and a smooth, possibly nonconvex, nonseparable function. We propose two new algorithms. The first is an inertial proximal coordinate subgradient algorithm, which updates the variables using the proximal subgradients of each separable function at the current point. The second is an inertial proximal block coordinate method, which updates the variables using the subgradients of the separable functions at the partially updated points. Global convergence is guaranteed under the Kurdyka–Łojasiewicz (KŁ) property and some additional mild assumptions. A convergence rate is derived based on the Łojasiewicz exponent. Two numerical examples are given to illustrate the effectiveness of the algorithms.
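The structure of an inertial proximal coordinate update can be illustrated on a toy instance. The sketch below is not the paper's algorithm; it is a minimal, assumption-laden illustration on the lasso-type problem min_x 0.5||Ax − b||² + λ||x||₁, where 0.5||Ax − b||² plays the role of the smooth nonseparable part and each λ|x_i| is a separable nonsmooth term whose proximal map (soft thresholding) stands in for the proximal subgradient step. The inertial parameter `beta`, the step size `(1 − beta)/L_i`, and the problem data are all illustrative choices.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*|.| : shrink v toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def inertial_prox_coordinate(A, b, lam=0.1, beta=0.5, n_iter=200, seed=0):
    """Toy inertial proximal coordinate sketch for
    min_x 0.5*||Ax - b||^2 + lam*||x||_1
    (smooth nonseparable part + separable nonsmooth parts).
    beta, step sizes, and initialization are illustrative choices."""
    m, n = A.shape
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)
    x_prev = x.copy()
    # Coordinate-wise Lipschitz constants of the partial gradients of the smooth part.
    L = np.sum(A * A, axis=0)
    for _ in range(n_iter):
        x_old = x.copy()
        for i in range(n):
            # Inertial extrapolation on coordinate i using the previous iterate.
            y = x[i] + beta * (x[i] - x_prev[i])
            # Partial gradient of the smooth part at the point with coordinate i set to y.
            r = A @ x - b + A[:, i] * (y - x[i])
            g = A[:, i] @ r
            # Damped step size; proximal step handles the nonsmooth lam*|x_i| term.
            step = (1.0 - beta) / L[i]
            x[i] = soft_threshold(y - step * g, step * lam)
        x_prev = x_old
    return x
```

The coupling between coordinates enters only through the partial gradient of the smooth part, which is what allows each separable nonsmooth term to be handled by its own one-dimensional proximal step.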