Abstract
In the study of optimization problems it is often necessary to consider functions that are not differentiable. This has led to the development of generalized gradients and a corresponding calculus for certain classes of functions. Rockafellar [16] and others have developed a strong and elegant theory of subgradients for convex functions, which gives pointwise criteria for the existence of extrema in optimization problems. Many optimization problems, however, involve functions that are neither differentiable nor convex; such functions arise in many settings, including optimal value functions [15]. To deal with such problems, Clarke [3] defined a type of subgradient for nonconvex functions. The definition was given initially for Lipschitz functions on R^n. Clarke then extended it to lower semicontinuous (l.s.c.) functions on Banach spaces through the use of a generalized directional derivative, the distance function to a closed set, and tangent and normal cones to closed sets.
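For reference, the standard constructions the abstract alludes to can be sketched as follows; these formulas are not stated in the abstract itself but are the usual definitions from Clarke's theory, given here under the assumption that f is locally Lipschitz on a Banach space X and C is a closed subset of X.

% Reference sketch only: standard definitions the abstract alludes to,
% not formulas quoted from the paper itself.
% Clarke's generalized directional derivative of a locally Lipschitz f
% at x in the direction v:
\[
  f^{\circ}(x; v) \;=\; \limsup_{\substack{y \to x \\ t \downarrow 0}}
    \frac{f(y + t v) - f(y)}{t},
\]
% the generalized gradient (Clarke subdifferential) of f at x:
\[
  \partial f(x) \;=\; \{\, \xi \in X^{*} : \langle \xi, v \rangle \le
    f^{\circ}(x; v) \ \text{for all } v \in X \,\},
\]
% the distance function to a closed set C, which is Lipschitz of rank 1
% and so has a generalized directional derivative everywhere:
\[
  d_C(x) \;=\; \inf_{c \in C} \| x - c \|,
\]
% and, for x in C, the tangent cone and its polar, the normal cone:
\[
  T_C(x) \;=\; \{\, v \in X : d_C^{\circ}(x; v) = 0 \,\}, \qquad
  N_C(x) \;=\; \{\, \xi \in X^{*} : \langle \xi, v \rangle \le 0
    \ \text{for all } v \in T_C(x) \,\}.
\]

The l.s.c. extension mentioned in the abstract proceeds by applying these set-based constructions, since d_C is Lipschitz even when the underlying function is only lower semicontinuous.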