Abstract

This article deals with a vector optimization problem with cone constraints in a Banach space setting. By making use of a real-valued Lagrangian and the concept of generalized subconvex-like functions, weakly efficient solutions are characterized through saddle point type conditions. The results, together with the notion of generalized Hessian (introduced in [Cominetti, R., Correa, R.: A generalized second-order derivative in nonsmooth optimization. SIAM J. Control Optim. 28, 789–809 (1990)]), are applied to obtain second order necessary and sufficient optimality conditions (without requiring twice differentiability of the objective and constraint functions) for the particular case in which the functions involved map a general Banach space into finite dimensional spaces.
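
For orientation, a plausible statement of the problem class described above (the paper's precise formulation of (P) may differ; the spaces and cones below are illustrative):

\[
(\mathrm{P}) \quad K\text{-minimize } f(x) \quad \text{subject to } g(x) \in -D, \; x \in \Omega,
\]

where \(f : X \to Y\) and \(g : X \to Z\) act between Banach spaces, \(\Omega \subseteq X\), and \(K \subset Y\), \(D \subset Z\) are closed convex cones with \(\operatorname{int} K \neq \emptyset\). A feasible point \(\bar{x}\) is weakly efficient when there is no feasible \(x\) with \(f(\bar{x}) - f(x) \in \operatorname{int} K\), i.e., no feasible point strictly improves \(f(\bar{x})\) with respect to the interior of the ordering cone.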

Highlights

  • In many situations, practical or theoretical, finite dimensional spaces are not the most suitable ones for modeling or studying a given problem

  • The development of optimality conditions for vectorial abstract programming problems is of great importance

  • Cominetti and Correa introduced in [8] the notions of second order directional derivative and generalized Hessian and gave some second order optimality conditions for an abstract scalar minimization problem (see the sketch after this list)
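
As a companion to the last highlight, the generalized second-order directional derivative of [8] is commonly quoted in the following form (a sketch; the notation is illustrative and the exact hypotheses are in [8]):

\[
f''(\bar{x}; u, v) = \limsup_{\substack{y \to \bar{x} \\ s,\, t \downarrow 0}} \frac{f(y + su + tv) - f(y + su) - f(y + tv) + f(y)}{st},
\]

which requires no differentiability of \(f\) and, when \(f\) is twice continuously differentiable, reduces to the classical second-order term \(\langle \nabla^{2} f(\bar{x}) u, v \rangle\). The generalized Hessian is then built from this bisublinear function; see [8] for the precise construction.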

Summary

Introduction and Formulation of the Problem

In many situations, practical or theoretical, finite dimensional spaces are not the most suitable ones for modeling or studying a given problem. As pointed out by Cominetti and Correa [8], many techniques commonly employed in optimization generate “nonsmoothness” even when the underlying problems are differentiable. This arises, for example, in duality theory, sensitivity and stability analysis, decomposition techniques, and penalty methods, among others. With respect to necessary optimality conditions without differentiability, one can resort to those of Fritz John or Kuhn-Tucker type, which are obtained under various generalized derivative concepts, or to saddle point conditions, where no differentiability assumption is required. Cominetti and Correa introduced in [8] the notions of second order directional derivative and generalized Hessian and gave some second order optimality conditions for an abstract scalar minimization problem. In Section 3 we establish saddle point type theorems for the vectorial optimization problem (P) and, in Section 4, we use these results to obtain second order conditions for problem (PF).
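
A sketch of the saddle point machinery alluded to above, assuming the real-valued Lagrangian takes the usual form for cone-constrained vector problems (the multipliers and inequalities below are illustrative, not the paper's exact statement):

\[
L(x, \lambda, \mu) = \langle \lambda, f(x) \rangle + \langle \mu, g(x) \rangle,
\qquad \lambda \in K^{*} \setminus \{0\}, \; \mu \in D^{*},
\]

where \(K^{*}\) and \(D^{*}\) denote the dual cones of \(K\) and \(D\). A triple \((\bar{x}, \bar{\lambda}, \bar{\mu})\) is a saddle point of \(L\) when

\[
L(\bar{x}, \bar{\lambda}, \mu) \le L(\bar{x}, \bar{\lambda}, \bar{\mu}) \le L(x, \bar{\lambda}, \bar{\mu})
\quad \text{for all } x \in \Omega, \; \mu \in D^{*},
\]

and, under the generalized subconvex-likeness assumptions, such saddle points characterize the weakly efficient solutions of (P).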

Preliminaries
Generalized subconvex-like functions and a Gordan type alternative theorem
Second order generalized derivative and the generalized Hessian
Saddle Point Type Conditions
Second Order Conditions