Abstract

The aim of this paper is to present a novel, transparent approach to a well-established field: the deep methods and applications of the complete analysis of continuous optimization problems. Standard descents give a unified approach to all standard necessary conditions, including the Lagrange multiplier rule, the Karush–Kuhn–Tucker conditions and the second-order conditions. Nonstandard descents lead to new necessary conditions. These can be used to give surprising proofs of deep central results from fields that are generally viewed as distinct from optimization: the fundamental theorem of algebra, the maximum and minimum principles of complex function theory, the separation theorems for convex sets, the orthogonal diagonalization of symmetric matrices and the implicit function theorem. These optimization proofs compare favorably with the usual proofs and are all based on the same strategy. This paper is addressed to all practitioners of optimization methods from many fields who are interested in fully understanding the foundations of these methods and of the central results above.
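For readers less familiar with the necessary conditions named above, the following is a standard statement of the Karush–Kuhn–Tucker conditions (not taken from the paper itself, and stated here only for orientation; the specific smoothness and constraint-qualification assumptions are the usual textbook ones, not the paper's).

\[
\begin{aligned}
&\text{Problem: } \min_{x \in \mathbb{R}^n} f(x)
\quad \text{s.t. } g_i(x) \le 0 \ (i=1,\dots,m), \quad h_j(x) = 0 \ (j=1,\dots,p).\\[4pt]
&\text{If } x^* \text{ is a local minimum and a constraint qualification holds, then there exist}\\
&\text{multipliers } \mu_i \ge 0 \text{ and } \lambda_j \text{ such that}\\[4pt]
&\nabla f(x^*) + \sum_{i=1}^{m} \mu_i \,\nabla g_i(x^*) + \sum_{j=1}^{p} \lambda_j \,\nabla h_j(x^*) = 0,
\qquad \mu_i \, g_i(x^*) = 0 \ \ (i=1,\dots,m).
\end{aligned}
\]

With no inequality constraints this reduces to the Lagrange multiplier rule; the paper's contribution, as described in the abstract, is to obtain such conditions uniformly via descent arguments.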
