Abstract

Recent theoretical and practical investigations have shown that the Gauss-Newton algorithm is the method of choice for the numerical solution of nonlinear least squares parameter estimation problems. It is shown that, under a special choice of parameterization, the Gauss-Newton algorithm with line searches behaves asymptotically like steepest descent. Based on this observation, a conjugate gradient acceleration is developed. It converges quickly even for large-residual problems, for which the original Gauss-Newton algorithm has a slow rate of convergence. Several numerical test examples are reported, verifying the applicability of the theory.
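To fix ideas, the following is a minimal sketch (not taken from the paper) of the baseline method the abstract refers to: a Gauss-Newton iteration with a line search for minimizing 0.5*||r(x)||^2. The function names, the backtracking (Armijo) line search, and the test problem are illustrative assumptions; the paper's conjugate gradient acceleration is not reproduced here.

```python
import numpy as np

def gauss_newton(r, J, x0, tol=1e-10, max_iter=100):
    """Gauss-Newton with a backtracking line search for min 0.5*||r(x)||^2.

    r  : callable returning the residual vector r(x)
    J  : callable returning the Jacobian matrix J(x)
    x0 : initial parameter estimate
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        rx, Jx = r(x), J(x)
        g = Jx.T @ rx                      # gradient of 0.5*||r(x)||^2
        if np.linalg.norm(g) < tol:
            break
        # Gauss-Newton step: least squares solution of J(x) p = -r(x)
        p, *_ = np.linalg.lstsq(Jx, -rx, rcond=None)
        # Backtracking line search enforcing an Armijo decrease condition
        t, f0 = 1.0, 0.5 * (rx @ rx)
        while 0.5 * np.sum(r(x + t * p) ** 2) > f0 + 1e-4 * t * (g @ p):
            t *= 0.5
        x = x + t * p
    return x

# Illustrative large-/small-residual test: fit y = a * exp(b * t)
t = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(-1.3 * t)
r = lambda x: x[0] * np.exp(x[1] * t) - y
J = lambda x: np.column_stack([np.exp(x[1] * t),
                               x[0] * t * np.exp(x[1] * t)])
print(gauss_newton(r, J, np.array([1.0, 0.0])))  # converges to ~[2.0, -1.3]
```

For small-residual problems this iteration is nearly Newton-like; the abstract's point is that for large residuals its rate degrades toward that of steepest descent, which motivates applying a conjugate gradient acceleration on top of it.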
