Abstract

In this paper, we propose a successive convex approximation framework for sparse optimization problems whose objective contains a nondifferentiable, nonconvex regularizer that can be written as the difference of two convex functions. The proposed framework is based on a nontrivial combination of the majorization-minimization method and successive convex approximation techniques developed for nonconvex optimization with a convex regularizer. The framework is flexible and leads to algorithms that exploit the problem structure and have low complexity. We demonstrate these advantages on an example application in which the nonconvex regularizer is the capped $\ell_{1}$-norm. Customizing the proposed framework to this problem, we obtain a best-response-type algorithm that updates all elements of the unknown parameter in parallel according to closed-form expressions. Finally, the proposed algorithms are tested numerically.
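To make the setting concrete, the sketch below illustrates (under our own assumptions, not the paper's exact algorithm) the two ingredients the abstract names: the capped $\ell_{1}$-norm admits the difference-of-convex decomposition $\min(|t|,\theta) = |t| - \max(|t|-\theta, 0)$, and a majorization-minimization scheme can linearize the subtracted convex part at the current iterate, leaving a convex weighted-$\ell_{1}$ surrogate whose proximal step (soft-thresholding) updates every coordinate in parallel in closed form. All function names and the least-squares data-fit term are illustrative choices.

```python
import numpy as np

def capped_l1(x, theta):
    # Capped l1-norm: sum_i min(|x_i|, theta) -- nonconvex in x.
    return np.minimum(np.abs(x), theta).sum()

def dc_parts(x, theta):
    # Difference-of-convex split: min(|t|, theta) = |t| - max(|t| - theta, 0).
    # Both g1 and g2 below are convex; their difference is the capped l1-norm.
    g1 = np.abs(x).sum()
    g2 = np.maximum(np.abs(x) - theta, 0.0).sum()
    return g1, g2

def soft_threshold(v, tau):
    # Closed-form prox of tau*||.||_1, applied elementwise (hence parallel).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def mm_capped_l1_ls(A, b, lam, theta, n_iter=200):
    """Illustrative MM scheme for
        min_x 0.5*||Ax - b||^2 + lam * capped_l1(x, theta).
    At each iteration the concave part -lam*g2 is linearized via a
    subgradient s, and one proximal-gradient step is taken on the
    resulting convex surrogate (a weighted l1 problem)."""
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L for the smooth term
    for _ in range(n_iter):
        s = np.sign(x) * (np.abs(x) > theta)     # subgradient of g2 at x
        grad = A.T @ (A @ x - b) - lam * s       # gradient of smooth surrogate part
        x = soft_threshold(x - step * grad, step * lam)
    return x
```

The decomposition can be checked directly: for $|x_i| \le \theta$ the second term vanishes, and for $|x_i| > \theta$ the difference equals $\theta$, matching $\min(|x_i|,\theta)$ in both cases.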
