Abstract

Separable nonlinear models are common in many research fields, such as machine learning and system identification. The variable projection (VP) approach is an efficient way to optimize such models. In this paper, we study several VP algorithms based on different matrix decompositions. In contrast to previous methods, we use the analytical expression of the Jacobian matrix rather than finite-difference approximations, which improves the efficiency of the VP algorithms. In particular, a more robust implementation of the VP algorithm, based on the modified Gram-Schmidt (MGS) method, is introduced for separable nonlinear least-squares problems. In numerical experiments, we compare the performance of five implementations of the VP algorithm. The results demonstrate the efficiency and robustness of the proposed MGS-based VP algorithm.
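
For context, the following is a minimal sketch of the standard variable projection formulation (in the spirit of Golub and Pereyra) on which such algorithms are typically built; the symbols y, Phi, alpha, and c are generic placeholders rather than the paper's own notation, and the Jacobian formula shown is the classical one, not necessarily the exact variant used here.

```latex
% Separable nonlinear least-squares problem: observations y are modeled as a
% linear combination (coefficients c) of basis functions that depend
% nonlinearly on the parameters \alpha.
\min_{\alpha,\, c} \; \bigl\| y - \Phi(\alpha)\, c \bigr\|_2^2

% Variable projection eliminates the linear parameters via the pseudoinverse,
% leaving a reduced residual that depends on \alpha only:
c(\alpha) = \Phi(\alpha)^{+} y,
\qquad
r(\alpha) = \bigl( I - \Phi(\alpha)\,\Phi(\alpha)^{+} \bigr) y
          = P^{\perp}_{\Phi(\alpha)}\, y .

% Analytical Jacobian of the reduced residual (classical Golub--Pereyra
% formula); its k-th column is
J_k = -\Bigl[ P^{\perp}_{\Phi}\, \tfrac{\partial \Phi}{\partial \alpha_k}\, \Phi^{+}
      + \bigl( P^{\perp}_{\Phi}\, \tfrac{\partial \Phi}{\partial \alpha_k}\, \Phi^{+} \bigr)^{\top} \Bigr] y .
```

Different matrix decompositions of Phi(alpha) (e.g., QR via modified Gram-Schmidt, as in the method proposed here) yield different ways of forming the projector and the Jacobian, which is where the implementations compared in the paper differ.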
