Abstract
In this paper, we develop a splitting algorithm incorporating Bregman distances to solve a broad class of linearly constrained composite optimization problems whose objective function is the separable sum of possibly nonconvex nonsmooth functions and a smooth function, coupled by a difference of functions. This structure encompasses many significant nonconvex and nonsmooth optimization problems in the current literature, including linearly constrained difference-of-convex problems. Relying on successive linearization and the alternating direction method of multipliers (ADMM), the proposed algorithm exhibits global subsequential convergence to a stationary point of the underlying problem. We also establish convergence of the full sequence generated by our algorithm under the Kurdyka–Łojasiewicz property and some mild assumptions. The efficiency of the proposed algorithm is tested on a robust principal component analysis problem and a nonconvex optimal power flow problem.
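As a rough illustration of the problem class described above (the precise template, block structure, and standing assumptions are those specified in the paper; the symbols $f_i$, $h$, $g$, $A_i$, and $b$ below are placeholders), such problems can be written in a form like
\[
\min_{x_1,\dots,x_p}\; \sum_{i=1}^{p} f_i(x_i) \;+\; h(x_1,\dots,x_p) \;-\; g(x_1,\dots,x_p)
\quad \text{subject to} \quad \sum_{i=1}^{p} A_i x_i = b,
\]
where each $f_i$ is possibly nonconvex and nonsmooth, $h$ is smooth, and the subtracted term $g$ carries the difference structure; taking each $f_i$ and $g$ convex recovers a linearly constrained difference-of-convex program.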