Abstract
The Gauss-Newton algorithm is often used to minimize a nonlinear least-squares loss function instead of the original Newton-Raphson algorithm. The main reason is that only first-order derivatives are needed to construct the Jacobian matrix. Some applications, such as multivariable system identification, give rise to weighted nonlinear least-squares problems for which it can become quite hard to obtain an analytical expression of the Jacobian matrix. To overcome this difficulty, a pseudo-Jacobian matrix is introduced, which leaves the stationary points unchanged and can be calculated analytically. Moreover, by slightly modifying the pseudo-Jacobian matrix, a better approximation of the Hessian can be obtained, resulting in faster convergence.
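For context, the sketch below illustrates the standard weighted Gauss-Newton update the abstract refers to, in which the product of the Jacobian with itself serves as a first-order approximation of the Hessian. It does not reproduce the paper's pseudo-Jacobian construction; the function names, the example model, and the weighting matrix are illustrative assumptions.

```python
# Minimal sketch of a weighted Gauss-Newton iteration (standard method only;
# the pseudo-Jacobian proposed in the paper is not shown here).
import numpy as np

def gauss_newton(residual, jacobian, theta0, W, n_iter=20, tol=1e-10):
    """Minimize r(theta)^T W r(theta) with Gauss-Newton updates.

    residual : callable returning the residual vector r(theta)
    jacobian : callable returning the Jacobian dr/dtheta (first derivatives only)
    theta0   : initial parameter estimate
    W        : positive-definite weighting matrix
    """
    theta = np.asarray(theta0, dtype=float)
    for _ in range(n_iter):
        r = residual(theta)
        J = jacobian(theta)
        # J^T W J approximates the Hessian of the weighted cost, so no
        # second-order derivatives are required.
        step = np.linalg.solve(J.T @ W @ J, -J.T @ W @ r)
        theta = theta + step
        if np.linalg.norm(step) < tol:
            break
    return theta

# Example: fit y = a * exp(b * x) to noisy data with an identity weighting.
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * np.exp(1.5 * x) + 0.01 * np.random.default_rng(0).standard_normal(50)
W = np.eye(50)

res = lambda th: th[0] * np.exp(th[1] * x) - y
jac = lambda th: np.column_stack([np.exp(th[1] * x),
                                  th[0] * x * np.exp(th[1] * x)])

theta_hat = gauss_newton(res, jac, theta0=[1.0, 1.0], W=W)
```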