Abstract

In this paper, we consider the minimization of a class of nonconvex composite functions with a difference-of-convex structure under linear constraints. While problems of this kind can in principle be solved by the celebrated alternating direction method of multipliers (ADMM), a direct application of ADMM often leads to difficult nonconvex subproblems. To address this issue, we propose to convexify the subproblems through a linearization technique, as done in the difference of convex functions algorithm (DCA). By assuming the Kurdyka-Łojasiewicz property, we prove that the resulting algorithm sequentially converges to a critical point. It turns out that in signal and image processing applications such as compressed sensing and image denoising, the proposed algorithm typically admits closed-form subproblem solutions and can therefore be very efficient. We provide numerical experiments to demonstrate the effectiveness of our algorithm.
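For concreteness, a minimal sketch of the convexified iteration is given below, assuming a constrained DC splitting of the form $\min_{x,y}\, g(x) - h(x) + \varphi(y)$ subject to $Ax + By = c$ with $g$, $h$, $\varphi$ convex; the abstract does not state this exact splitting, so the form and notation here are illustrative assumptions. The DCA-style step replaces $h$ in the augmented-Lagrangian $x$-subproblem with its affine minorant at the current iterate, so that subproblem becomes convex:

\[
\begin{aligned}
x^{k+1} &\in \arg\min_{x}\; g(x) - \langle \xi^{k}, x\rangle + \tfrac{\rho}{2}\,\bigl\|Ax + By^{k} - c + \lambda^{k}/\rho\bigr\|^{2}, \qquad \xi^{k}\in\partial h(x^{k}),\\
y^{k+1} &\in \arg\min_{y}\; \varphi(y) + \tfrac{\rho}{2}\,\bigl\|Ax^{k+1} + By - c + \lambda^{k}/\rho\bigr\|^{2},\\
\lambda^{k+1} &= \lambda^{k} + \rho\,\bigl(Ax^{k+1} + By^{k+1} - c\bigr).
\end{aligned}
\]

When $g$ is separable (for example, the $\ell_1$ norm in compressed sensing) and the linear operators have convenient structure, these minimizations reduce to proximal maps with closed-form solutions, which is the kind of efficiency the abstract refers to.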
