Abstract

Sparse optimization problems with an L0-norm regularization term in the objective function arise in a wide range of applications. In this paper, we propose a projected neural network, modeled by a differential equation, to solve a class of such problems in which the objective function is the sum of a nonsmooth convex loss function and a regularization term defined by the L0-norm. This optimization problem is not only nonconvex but also discontinuous. To simplify the structure of the proposed network and to obtain better convergence properties, we use a smoothing method, in which a newly constructed smoothing function for the regularization term plays a key role. We prove that the solution of the proposed network exists globally and is unique, and that any accumulation point of it is a critical point of the continuous relaxation model. Except for a special case, which can be easily checked, any critical point is a local minimizer of the considered sparse optimization problem. Interestingly, every critical point possesses a promising lower bound property, one that is satisfied by all global minimizers of the considered problem but not by all of its local minimizers. Finally, numerical experiments illustrate the efficiency and good performance of the proposed method on this class of sparse optimization problems, which includes the most widely used models in feature selection for classification learning.
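For concreteness, the class of problems described above can be written as follows. The specific smoothing function used in the paper is constructed in the body of the work; the surrogate $\phi_\mu$ shown here, together with the symbols $f$, $\lambda$, $\mu$, and the feasible set $\Omega$ onto which the network projects, are only illustrative assumptions, not the authors' exact construction.

\begin{equation*}
  \min_{x \in \Omega \subseteq \mathbb{R}^n} \; f(x) + \lambda \|x\|_0,
  \qquad \|x\|_0 := \#\{\, i : x_i \neq 0 \,\},
\end{equation*}
where $f$ is a nonsmooth convex loss (e.g., a hinge loss in classification) and $\lambda > 0$. A smoothing method replaces the discontinuous regularizer by a smooth surrogate controlled by a parameter $\mu > 0$, for instance
\begin{equation*}
  \|x\|_0 \approx \sum_{i=1}^{n} \phi_\mu(x_i),
  \qquad \phi_\mu(t) = \frac{t^2}{t^2 + \mu},
\end{equation*}
which yields a continuous relaxation $\min_{x \in \Omega} f(x) + \lambda \sum_{i} \phi_\mu(x_i)$ whose critical points the projected differential-equation network can track as $\mu \downarrow 0$.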
