Abstract

In this article, we consider a class of nonsmooth, nonconvex, and non-Lipschitz optimization problems with wide applications in sparse optimization. We generalize the Clarke stationary point and define a class of generalized stationary points of these problems with stronger optimality properties. Based on the smoothing method, we propose a projected neural network for solving this class of optimization problems. Under the condition that the level set of the objective function in the feasible region is bounded, we prove that the solution of the proposed neural network exists globally and is bounded. The uniqueness of the solution is also analyzed. When the feasible region is bounded, any accumulation point of the proposed neural network is a generalized stationary point of the optimization model. Under suitable conditions, any solution of the proposed neural network converges asymptotically to a stationary point. In particular, we give a detailed analysis of the proposed network for a special class of non-Lipschitz optimization problems, which yields a lower bound property and a unified identification of the nonzero entries of all accumulation points. Finally, numerical results are presented to show the efficiency of the proposed neural network in solving several classes of sparse optimization models.
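To make the proposed dynamics concrete, below is a minimal numerical sketch of a smoothing-based projected network applied to an l_p-regularized least-squares model with box constraints, one of the sparse optimization models the abstract alludes to. The Huber-type smoothing function, the exponential decay schedule for the smoothing parameter mu, the box feasible set, and the name smoothing_projected_network are illustrative assumptions, not the paper's exact construction.

import numpy as np

def grad_smoothed_penalty(x, mu, p, lam):
    # Assumed Huber-type smoothing of |s|: phi_mu(s) = |s| if |s| > mu,
    # else s^2/(2 mu) + mu/2; note phi_mu >= mu/2 > 0 everywhere.
    phi = np.where(np.abs(x) > mu, np.abs(x), x**2 / (2 * mu) + mu / 2)
    dphi = np.where(np.abs(x) > mu, np.sign(x), x / mu)
    # chain rule: d/dx [phi(x)^p] = p * phi^(p-1) * phi'(x)
    return lam * p * phi ** (p - 1) * dphi

def smoothing_projected_network(A, b, l, u, p=0.5, lam=0.1,
                                mu0=1.0, dt=1e-3, T=20.0):
    # min ||Ax - b||^2 / 2 + lam * sum_i |x_i|^p  s.t.  l <= x <= u
    x = np.zeros(A.shape[1])
    for k in range(int(T / dt)):
        mu = mu0 * np.exp(-k * dt)  # smoothing parameter driven to 0
        g = A.T @ (A @ x - b) + grad_smoothed_penalty(x, mu, p, lam)
        # projected dynamics dx/dt = P_Omega(x - g) - x, forward Euler,
        # where P_Omega is the projection onto the box [l, u]
        x = x + dt * (np.clip(x - g, l, u) - x)
    return x

# Toy usage: recover a sparse vector from noisy measurements
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[[3, 17, 60]] = [1.0, -0.8, 0.5]
b = A @ x_true + 0.01 * rng.standard_normal(40)
x_hat = smoothing_projected_network(A, b, l=-2.0, u=2.0)
print("nonzeros found:", np.flatnonzero(np.abs(x_hat) > 1e-2))

Since each forward-Euler update is a convex combination of the current state and its projection onto the box, the trajectory stays in the feasible region, which loosely mirrors the boundedness arguments described in the abstract.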
