Abstract

The central idea of differential calculus is that the derivative of a function defines the best local linear approximation to the function near a given point. This basic idea, together with some representation theorems from linear algebra, unifies the various derivatives—gradients, Jacobians, Hessians, and so forth—encountered in engineering and optimization. The basic differentiation rules presented in calculus classes, notably the product and chain rules, allow the computation of the gradients and Hessians needed by optimization algorithms, even when the underlying operators are quite complex. Examples include the solution operators of time-dependent and steady-state partial differential equations. Alternatives to the hand-coding of derivatives are finite differences and automatic differentiation, both of which save programming time at the possible cost of run-time efficiency.
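
As an illustration of the trade-offs the abstract mentions, the following sketch compares a hand-coded gradient (derived via the product and chain rules), a forward finite-difference approximation, and automatic differentiation on a small scalar function. This is a minimal example, not code from the paper: the function f and the step size h are illustrative choices, and JAX is used here simply as one widely available automatic-differentiation tool.

```python
# Minimal sketch (assumptions: JAX is installed; f is a made-up example objective,
# not taken from the paper). It contrasts three ways of obtaining a gradient.
import jax
import jax.numpy as jnp

def f(x):
    # Example objective: f(x) = sin(x0) * x1^2
    return jnp.sin(x[0]) * x[1] ** 2

def grad_by_hand(x):
    # Gradient derived by hand with the product and chain rules.
    return jnp.array([jnp.cos(x[0]) * x[1] ** 2,
                      2.0 * jnp.sin(x[0]) * x[1]])

def grad_fd(x, h=1e-6):
    # Forward finite differences: one extra function evaluation per component,
    # no derivation needed, but only approximate (truncation + roundoff error).
    e = jnp.eye(x.size)
    return jnp.array([(f(x + h * e[i]) - f(x)) / h for i in range(x.size)])

x = jnp.array([0.7, 1.3])
print(grad_by_hand(x))   # exact, but requires manual derivation and coding
print(grad_fd(x))        # approximate, cheap to program
print(jax.grad(f)(x))    # automatic differentiation: exact to roundoff, no hand work
```

All three approaches agree to several digits on this example; the differences show up in programming effort and run-time cost, which is the comparison the abstract draws between hand-coding, finite differences, and automatic differentiation.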
