This paper explores the necessary conditions for an extremum in both constrained and unconstrained problems by extracting the fundamental principles underlying constraint conditions. We provide a precise geometric understanding of the Lagrange multiplier, prioritizing analytical insight. Beginning with a geometric interpretation of the gradient, we use the expansion of functions and their images to characterize extrema and detail the derivation of the Lagrangian. We then expand the basis vectors of the constraint surface into those of the full space and use a transition matrix to assess the function's extrema, showing how the second-derivative matrix is transformed into its full-space representation to identify extrema in optimization problems. Additionally, we introduce incremental variables to simplify the second-order derivative matrix in the full space, providing a novel perspective on the necessary conditions for an extremum.
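As a point of reference for the geometric picture described above, the classical first-order necessary condition for a constrained extremum can be sketched as follows (the notation here is the standard textbook formulation, not taken from the paper itself):

```latex
% Minimize f(x) subject to g(x) = 0, with f, g : \mathbb{R}^n \to \mathbb{R} smooth.
% At a regular constrained extremum x^*, the gradient of f must be normal to the
% constraint surface, i.e. parallel to the gradient of g:
\nabla f(x^{*}) + \lambda \, \nabla g(x^{*}) = 0, \qquad g(x^{*}) = 0.
% Equivalently, x^* is a stationary point of the Lagrangian
% L(x, \lambda) = f(x) + \lambda \, g(x).
```

Geometrically, this says that along any direction tangent to the constraint surface the directional derivative of $f$ vanishes, which is the condition the paper's basis-expansion and transition-matrix arguments make precise.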