Abstract
In this paper, we propose a novel algorithm based on quadratic-piecewise-linear approximations of DC functions for solving nonnegative sparsity-constrained optimization. A penalized DC (difference of two convex functions) formulation is proved to be equivalent to the original problem for a suitably chosen penalty parameter. We apply quadratic-piecewise-linear approximations to the two parts of the DC objective function, which yields a nonconvex subproblem; this subproblem is the key ingredient of our main algorithm and can be solved by a globally convergent alternating-variable algorithm. Under mild conditions, we prove that the proposed main algorithm for the penalized problem is globally convergent. Preliminary numerical results on sparse nonnegative least squares and sparse logistic regression problems demonstrate the efficiency of our algorithm.
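To make the problem class concrete, the sketch below sets up the nonnegative sparsity-constrained least squares problem mentioned above and solves it with a generic projected-gradient (hard-thresholding) baseline. This is an illustrative reference method only, not the paper's penalized-DC algorithm; the function names and step-size choice are our own assumptions.

```python
import numpy as np

def project_nonneg_sparse(x, s):
    """Project x onto {z : z >= 0, ||z||_0 <= s}:
    clip negative entries, then keep only the s largest entries."""
    z = np.maximum(x, 0.0)
    if np.count_nonzero(z) > s:
        keep = np.argpartition(z, -s)[-s:]  # indices of the s largest entries
        mask = np.zeros_like(z, dtype=bool)
        mask[keep] = True
        z[~mask] = 0.0
    return z

def projected_gradient_nnls(A, b, s, step=None, iters=2000):
    """Illustrative baseline for
        min 0.5 * ||A x - b||^2   s.t.  x >= 0, ||x||_0 <= s,
    via gradient steps followed by projection onto the constraint set."""
    _, n = A.shape
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L with L = ||A||_2^2
    x = np.zeros(n)
    for _ in range(iters):
        grad = A.T @ (A @ x - b)
        x = project_nonneg_sparse(x - step * grad, s)
    return x
```

On well-conditioned random instances this baseline typically recovers the sparse nonnegative solution, which makes it a useful sanity check when comparing against more sophisticated DC-based methods.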