Abstract

This paper studies constrained optimization problems in Banach spaces without the usual differentiability and convexity assumptions on the functionals defining the data. The aim is to give optimality conditions for these problems from which characterizations of best approximations can be derived. The objective and the inequality-constraint functionals are assumed to have one-sided directional derivatives. First-order necessary conditions are given in terms of subdifferentials of the directional derivatives. The notion of “max-pseudoconvexity”, which is weaker than pseudoconvexity, is introduced to obtain sufficient conditions. The optimality conditions are applied to linear and nonlinear Tchebycheff approximation problems to derive characterizations of best approximations.
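For orientation, the objects named in the abstract have standard forms, sketched below in LaTeX. This is the textbook shape of a one-sided directional derivative and of a subdifferential-based first-order condition (shown here for the unconstrained case), not necessarily the paper's exact statement:

```latex
% One-sided directional derivative of f at x in direction d:
f'(x; d) = \lim_{t \downarrow 0} \frac{f(x + t d) - f(x)}{t}.

% When f'(x; \cdot) is sublinear, its subdifferential at the origin is
\partial f'(x; \cdot)(0)
  = \{\, x^* \in X^* : \langle x^*, d \rangle \le f'(x; d)
      \ \text{for all } d \in X \,\}.

% A first-order necessary condition at a local minimizer \bar{x}
% (unconstrained case) then reads
0 \in \partial f'(\bar{x}; \cdot)(0),
\quad\text{equivalently}\quad
f'(\bar{x}; d) \ge 0 \ \text{for all } d \in X.
```

In the constrained setting studied in the paper, conditions of this type are combined with analogous subdifferential information from the inequality constraints.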
