Abstract
In this paper, we propose a novel algorithm based on quadratic-piecewise-linear approximations of DC (difference-of-convex) functions to solve nonnegative sparsity-constrained optimization problems. A penalized DC formulation is proved to be equivalent to the original problem for a suitably chosen penalty parameter. We apply quadratic-piecewise-linear approximations to the two parts of the DC objective function, which yields a nonconvex subproblem; this is the key ingredient of our main algorithm. The subproblem can be solved by a globally convergent alternating-variable algorithm. Under mild conditions, we prove that the proposed main algorithm for the penalized problem is globally convergent. Preliminary numerical results on sparse nonnegative least squares and logistic regression problems demonstrate the efficiency of our algorithm.
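To make the problem class concrete, the sketch below sets up the sparse nonnegative least squares instance mentioned in the abstract, min 0.5*||Ax - b||^2 subject to x >= 0 and ||x||_0 <= s, and solves it with a simple projected-gradient (hard-thresholding) baseline. This is not the paper's penalized DC algorithm; it is a minimal hedged illustration of the constraint set, and all names (`proj_nonneg_sparse`, `sparse_nn_least_squares`) are hypothetical.

```python
import numpy as np

def proj_nonneg_sparse(x, s):
    # Project onto {x : x >= 0, ||x||_0 <= s}:
    # clip negative entries, then keep the s largest remaining ones.
    x = np.maximum(x, 0.0)
    if np.count_nonzero(x) > s:
        x[np.argsort(x)[:-s]] = 0.0  # zero out all but the s largest
    return x

def sparse_nn_least_squares(A, b, s, iters=500):
    # Projected gradient descent on 0.5*||Ax - b||^2 with step 1/L,
    # where L = ||A||_2^2 is a Lipschitz constant of the gradient.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)
        x = proj_nonneg_sparse(x - grad / L, s)
    return x

# Small demo: recover a 2-sparse nonnegative signal from noiseless data.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 8))
x_true = np.zeros(8)
x_true[[1, 5]] = [2.0, 1.5]
b = A @ x_true
x_hat = sparse_nn_least_squares(A, b, s=2)
```

The projection step is the only place where both the nonnegativity and the sparsity constraints enter; the paper's contribution is to replace this hard projection with a penalized DC formulation handled by quadratic-piecewise-linear approximations.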