Abstract
This paper presents a new approach to Automatic Differentiation (AD) for a scalar-valued, twice continuously differentiable function f : R^n → R. A new arithmetic is obtained, based on the chain rule and an augmented algebra of real numbers. This chain-rule-based differentiation arithmetic is used to find the Gradient and Hessian. The Jacobian is evaluated by applying the Gradient arithmetic to each component function and arranging the resulting Gradients in matrix form. The derivative evaluation is implemented through operator overloading in computer programs written in C++.
Highlights
Any efficient non-linear optimization routine needs good gradient approximations
A new arithmetic is obtained based on the chain rule and an augmented algebra of real numbers, through the forward mode of Automatic Differentiation
If the independent variables x_i of a formula for a function f : R^n → R, x ↦ f(x), are replaced by X_i = (x_i, e(i), 0), and if all constants c are replaced by their (c, 0, 0) representation, evaluation of f using the rules of differentiation arithmetic gives f(X) = (f(x), ∇f(x), Hf(x))
Summary
Any efficient non-linear optimization routine needs good gradient approximations. Several research groups have developed the technique of Automatic Differentiation, which generates exact derivatives for a given code segment. A comprehensive introduction to the method can be found in (Griewank, A., 2000 & 1990; Naumann, U., 2008; Moore, R.E., 1962; Rall, L.B., 2007). Automatic Differentiation can be implemented in various ways, each partially dependent on circumstances. We use a new methodology to implement AD for computing the Gradient, Hessian and Jacobian: a new arithmetic based on the chain rule and an augmented algebra of real numbers, obtained through the forward mode of Automatic Differentiation.