Abstract

Separable nonlinear models arise frequently in research fields such as machine learning and system identification. The variable projection (VP) approach is an efficient way to optimize such models. In this paper, we study several VP algorithms based on different matrix decompositions. Unlike previous methods, we use the analytical expression of the Jacobian matrix instead of finite differences, which improves the efficiency of the VP algorithms. In particular, based on the modified Gram-Schmidt (MGS) method, we introduce a more robust implementation of the VP algorithm for separable nonlinear least-squares problems. In numerical experiments, we compare the performance of five different implementations of the VP algorithm. The results demonstrate the efficiency and robustness of the proposed MGS-based VP algorithm.
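To make the idea of variable projection concrete, the sketch below fits a separable model y ≈ Φ(α)c, where Φ depends nonlinearly on the parameters α and c enters linearly. The two-exponential model, the synthetic data, and the use of scipy.optimize.least_squares with finite-difference Jacobians are illustrative assumptions only; the paper's contribution is precisely to replace finite differences with an analytical Jacobian and to form the projection robustly via MGS-based QR rather than a generic least-squares solve.

```python
# A minimal sketch of variable projection (VP) for a separable model
# y ~ Phi(alpha) @ c: eliminate the linear coefficients c, then optimize
# only the nonlinear parameters alpha on the projected residual.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
t = np.linspace(0.0, 4.0, 100)

def basis(alpha):
    # Columns of Phi(alpha): nonlinear basis functions (here two exponentials).
    return np.column_stack([np.exp(-alpha[0] * t), np.exp(-alpha[1] * t)])

# Synthetic data from known parameters plus a little noise (illustrative only).
true_alpha, true_c = np.array([0.5, 2.0]), np.array([1.0, 3.0])
y = basis(true_alpha) @ true_c + 0.01 * rng.standard_normal(t.size)

def vp_residual(alpha):
    # Solve the inner linear least-squares subproblem for c, then return the
    # projected residual r(alpha) = y - Phi(alpha) c(alpha).
    Phi = basis(alpha)
    c, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return y - Phi @ c

# Optimize only alpha; the Jacobian here is approximated by finite differences,
# whereas the paper derives and uses its analytical expression.
sol = least_squares(vp_residual, x0=np.array([1.0, 1.0]))
Phi_hat = basis(sol.x)
c_hat, *_ = np.linalg.lstsq(Phi_hat, y, rcond=None)
print("alpha =", sol.x, "c =", c_hat)
```

Eliminating c reduces the problem to a lower-dimensional nonlinear one, which is the source of VP's efficiency; how the projection and its Jacobian are computed (QR, SVD, MGS, etc.) is what distinguishes the implementations compared in the paper.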
