Abstract

We review the extended Jacobian approach to automatic differentiation of a user-supplied function and highlight the forward and reverse variants of its Schur complement form. We detail a MATLAB operator-overloading approach that constructs the extended Jacobian and enables the function Jacobian to be computed using MATLAB's sparse matrix operations. Memory and runtime costs are reduced using a variant of the hoisting technique of Bischof (Issues in Parallel Automatic Differentiation, 1991). On five of the six mesh-based gradient test problems from the MINPACK-2 Test Problem Collection (Averick et al., 1992), the reverse variant of our extended Jacobian technique with hoisting outperforms the sparse-storage forward mode of the MAD package (Forth, ACM Trans. Math. Software 32, 2006). For increasing problem size the ratio of gradient to function CPU time is seen to be bounded, if not decreasing, in line with the cheap gradient principle of Griewank and Walther (Evaluating Derivatives, SIAM, 2008).
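
To illustrate the Schur complement form referred to above, here is a minimal sketch; the notation (A, B, C, D) and the tiny test function are illustrative assumptions, not the paper's code or data. If a function evaluation is linearised as v = A*x + B*v for the vector of intermediate variables v, with B strictly lower triangular, and the outputs satisfy y = C*x + D*v, then the Jacobian dy/dx = C + D*(I - B)^(-1)*A is the Schur complement extracted from the extended Jacobian. The forward variant solves (I - B)*Z = A and forms C + D*Z; the reverse variant solves W*(I - B) = D and forms C + W*A. A MATLAB sketch for y = sin(x1*x2) + x2, using sparse matrices throughout:

    % Hypothetical example, not the paper's implementation.
    x  = [0.7; 1.3];                 % evaluation point (arbitrary)
    v1 = x(1)*x(2);                  % intermediate variables
    v2 = sin(v1);
    % Linearised relations: v = A*x + B*v (B strictly lower triangular),
    %                       y = C*x + D*v.
    A = sparse([x(2) x(1); 0 0]);
    B = sparse([0 0; cos(v1) 0]);
    C = sparse([0 1]);
    D = sparse([0 1]);
    I = speye(2);
    % Forward variant: solve (I - B)*Z = A by sparse forward substitution.
    J_fwd = C + D*((I - B)\A);
    % Reverse variant: solve W*(I - B) = D, then form C + W*A.
    J_rev = C + (D/(I - B))*A;
    % Both equal [x2*cos(x1*x2), x1*cos(x1*x2) + 1].

The forward substitution cost grows with the number of independent variables (columns of A), the reverse variant's cost with the number of dependent variables (rows of D); for the single-output gradient problems considered in the abstract this favours the reverse variant, consistent with the cheap gradient principle.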
