Abstract

We consider the problem of computing derivatives of an objective that is defined using implicit functions; i.e., the implicit variables are computed by solving equations that are often nonlinear and are solved by an iterative process. If one were to apply Algorithmic Differentiation (AD) directly, one would differentiate the iterative process itself. In this paper we present the Newton step methods for computing derivatives of such an objective. These methods make it easy to take advantage of sparsity, forward mode, reverse mode, and other AD techniques. We prove that the partial Newton step method works if the number of steps equals the order of the derivatives. The full Newton step method obtains two derivative orders for each step, except for the first step. There are alternative methods that avoid differentiating the iterative process; e.g., the method implemented in ADOL-C. An optimal control example demonstrates the advantage of the Newton step methods when computing both gradients and Hessians. We also discuss the Laplace approximation method for nonlinear mixed effects models as an example application.
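To make the idea concrete, the following is a minimal sketch, not taken from the paper, of differentiating through Newton steps rather than through the iterative solver. It uses JAX as the AD tool purely for illustration (the paper is framed in terms of general AD tools such as ADOL-C), and all names here (G, solve, newton_step, f) are hypothetical. The implicit variable y(x) solves G(x, y) = y^3 + y - x = 0, the objective is f(x) = x * y(x), and one Newton step is applied per derivative order requested, consistent with the step counts stated above; the paper's precise partial/full variants are defined in the text.

```python
import jax


def G(x, y):
    # Equation defining the implicit variable: G(x, y(x)) = 0.
    return y ** 3 + y - x


def solve(x, y0=0.0, n_iter=50):
    # Iterative Newton solver for y(x).  Differentiating through all
    # n_iter iterations is exactly what the Newton step methods avoid.
    y = y0
    for _ in range(n_iter):
        y = y - G(x, y) / jax.grad(G, argnums=1)(x, y)
    return y


def newton_step(x, y):
    # One Newton step: a smooth, differentiable function of x.
    return y - G(x, y) / jax.grad(G, argnums=1)(x, y)


def f(x, order=1):
    # Detach the solver from the AD graph, then re-attach the dependence
    # on x by applying `order` Newton steps from the converged solution.
    y = jax.lax.stop_gradient(solve(x))
    for _ in range(order):
        y = newton_step(x, y)
    return x * y


x0 = 2.0
y_star = solve(x0)
# Implicit function theorem check: dy/dx = 1 / (3 y^2 + 1), so
# f'(x) = y + x / (3 y^2 + 1).
exact_grad = y_star + x0 / (3.0 * y_star ** 2 + 1.0)
print(jax.grad(f)(x0), exact_grad)          # gradient: one step suffices
print(jax.grad(jax.grad(lambda x: f(x, order=2)))(x0))  # Hessian: two steps
```

Because the solver's iterations sit behind stop_gradient, the AD graph seen by jax.grad contains only the (few) Newton steps, which is where the sparsity and mode-selection advantages mentioned in the abstract come from.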
