Abstract
We consider a class of constrained optimization problems where the objective function is a sum of a smooth function and a nonconvex non-Lipschitz function. Many problems in sparse portfolio selection, edge-preserving image restoration, and signal processing can be modelled in this form. First, we propose the concept of the Karush--Kuhn--Tucker (KKT) stationary condition for the non-Lipschitz problem and show that it is necessary for optimality under a constraint qualification called the relaxed constant positive linear dependence (RCPLD) condition, which is weaker than the Mangasarian--Fromovitz constraint qualification and holds automatically if all the constraint functions are affine. Then we propose an augmented Lagrangian (AL) method in which the augmented Lagrangian subproblems are solved by a nonmonotone proximal gradient method. Under the assumption that a feasible point is known, we show that any accumulation point of the sequence generated by our method must be a feasible point. Moreover, if RCPL...
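To fix ideas, the following is a minimal sketch of the problem class and of a generic augmented Lagrangian subproblem of the kind referred to above; the symbols $f$, $\Phi$, $g_i$, $h_j$, the multipliers $\lambda_i$, $\mu_j$, and the penalty parameter $\rho$ are illustrative placeholders, not notation taken from the paper itself:
\begin{align*}
\min_{x\in\mathbb{R}^n}\;& f(x) + \Phi(x)\\
\text{s.t.}\;& g_i(x)\le 0,\ i=1,\dots,m,\qquad h_j(x)=0,\ j=1,\dots,q,
\end{align*}
where $f$ is smooth and $\Phi$ is nonconvex and non-Lipschitz (for instance $\Phi(x)=\tau\|x\|_p^p$ with $0<p<1$). An AL method of the kind described would then approximately solve, at each outer iteration, a subproblem such as
\begin{equation*}
\min_{x}\; f(x) + \Phi(x) + \sum_{j=1}^{q}\Bigl(\mu_j h_j(x) + \tfrac{\rho}{2}h_j(x)^2\Bigr) + \tfrac{1}{2\rho}\sum_{i=1}^{m}\Bigl(\max\{0,\lambda_i+\rho g_i(x)\}^2-\lambda_i^2\Bigr),
\end{equation*}
with the smooth terms handled by gradient steps and $\Phi$ by its proximal mapping, which is the role of the nonmonotone proximal gradient subproblem solver mentioned above.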