In Optimization Theory, necessary and sufficient optimality conditions play an essential role. They allow one, first of all, to check whether a point under study satisfies the conditions and, secondly, if it does not, to find a "better" point. For the class of directionally differentiable functions, a necessary condition for an unconstrained minimum is that the directional derivative be non-negative in every direction. This condition becomes effectively verifiable for special classes of directionally differentiable functions. For example, in the case of convex and max-type functions, the necessary condition for a minimum takes the form of an inclusion, and the problem of verifying it reduces to finding the point of some convex compact set C nearest to the origin. If the origin does not belong to C, one easily finds the steepest descent direction and is able to construct a numerical method. In the classical Chebyshev polynomial approximation problem, necessary optimality conditions are expressed as an alternation of the signs of certain values. In the present paper, a generalization of the alternance approach to a general optimization problem is given. Two equivalent forms of the alternance condition (the so-called inside form and the outside form) are discussed in detail. In some cases it may be more convenient to use the condition in the form of an inclusion; in others, the condition in the alternance form, as in the Chebyshev approximation problem. Numerical methods based on the condition in the form of an inclusion are usually "gradient-type" methods, while methods employing the alternance form are often "Newton-type". It is hoped that in some cases it will be possible to enrich the existing armory of optimization algorithms by a new family of efficient tools. In this paper we discuss only unconstrained optimization problems in the finite-dimensional setting.
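For max-type functions, the inclusion form of the necessary condition and the associated steepest descent direction can be sketched as follows (a standard construction from nonsmooth analysis, given here only for illustration; the set C is realized by the subdifferential, and the functions f_i are generic smooth functions, not data from a specific example):

```latex
% Max-type function: f(x) = \max_{i \in I} f_i(x), each f_i smooth, I finite.
% The role of the convex compact set C is played by the subdifferential
\partial f(x) = \mathrm{conv}\bigl\{\nabla f_i(x) : i \in I,\ f_i(x) = f(x)\bigr\}.
% Inclusion form of the necessary condition at a minimizer x^*:
0 \in \partial f(x^*).
% If 0 \notin \partial f(x), let z be the point of \partial f(x) nearest to
% the origin; then the steepest descent direction is
d = -\frac{z}{\|z\|}, \qquad \text{with } f'(x;d) = -\|z\| < 0.
```

The last relation follows from the projection inequality: since z is the projection of the origin onto the convex set, every g in the subdifferential satisfies ⟨g, z⟩ ≥ ‖z‖², so the directional derivative max over g of ⟨g, d⟩ equals -‖z‖.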
In many cases, a constrained optimization problem can be reduced (via Exact Penalization Techniques) to an unconstrained one.
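A minimal sketch of the exact penalty reduction mentioned above, for an equality-constrained problem (the threshold λ* and the constraint function h are generic placeholders assumed for illustration, not taken from the paper):

```latex
% Equality-constrained problem:
\min_{x} f(x) \quad \text{subject to} \quad h(x) = 0.
% Exact penalty function with parameter \lambda > 0:
F_{\lambda}(x) = f(x) + \lambda\,|h(x)|.
% Under suitable regularity assumptions there exists a finite
% threshold \lambda^* such that, for every \lambda > \lambda^*,
% the minimizers of F_{\lambda} coincide with those of the
% constrained problem.
```

Note that F_λ is nonsmooth but directionally differentiable, so the unconstrained optimality conditions discussed above apply to it.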