Abstract
A nonlinear least squares iterative solver is considered for real-valued, sufficiently smooth functions. The algorithm is based on successive solution of orthogonal projections of the linearized equation onto a sequence of appropriately chosen low-dimensional subspaces. The bases of the latter are constructed using only the first-order derivatives of the function. A technique based on the concept of the limiting stepsize along a normalized direction (developed earlier by the author) is used to guarantee a monotone decrease of the nonlinear residual norm. Under rather mild conditions, convergence to zero is proved for the gradient and residual norms. The results of numerical testing are presented, including not only small-sized standard test problems but also larger and harder examples, such as algebraic problems associated with the canonical decomposition of dense and sparse 3D tensors, as well as finite-difference discretizations of 2D nonlinear boundary value problems for second-order partial differential equations.
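The abstract describes an iteration that repeatedly solves the linearized least-squares problem restricted to a low-dimensional subspace built from first-order information, with a stepsize rule that keeps the residual norm decreasing. The following Python sketch illustrates that general structure only; it is not the authors' algorithm. The subspace choice (current gradient direction plus a few previous steps), the simple backtracking used in place of the paper's limiting-stepsize rule, and all function names are illustrative assumptions.

```python
import numpy as np

def subspace_gauss_newton(f, jac, x0, max_iter=100, tol=1e-10, m=3):
    """Minimal sketch: minimize 0.5*||f(x)||^2 by solving the linearized
    problem projected onto a low-dimensional subspace at each step.
    `f` returns the residual vector, `jac` its Jacobian; only first-order
    derivatives are used. All choices here are illustrative assumptions."""
    x = np.asarray(x0, dtype=float)
    prev_steps = []                       # a few previous steps help span the subspace
    for _ in range(max_iter):
        r = f(x)
        J = jac(x)
        g = J.T @ r                       # gradient of 0.5*||f(x)||^2
        if np.linalg.norm(g) < tol:
            break
        # Assumed subspace basis: gradient direction plus up to m-1 previous steps
        cols = [g] + prev_steps[-(m - 1):]
        V, _ = np.linalg.qr(np.column_stack(cols))     # orthonormal basis
        # Orthogonal projection of the linearized equation: solve min ||J V y + r||
        y, *_ = np.linalg.lstsq(J @ V, -r, rcond=None)
        d = V @ y
        # Backtracking stepsize: a simple stand-in ensuring monotone decrease
        # of the residual norm (the paper uses a limiting-stepsize rule instead)
        t, rn = 1.0, np.linalg.norm(r)
        while np.linalg.norm(f(x + t * d)) >= rn and t > 1e-12:
            t *= 0.5
        x = x + t * d
        prev_steps.append(t * d)
    return x
```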
Highlights
Application areas of nonlinear least squares are numerous and include, for instance, the numerical solution of nonlinear equations arising as discrete models of physical problems, acceleration of neural network learning processes using Levenberg-Marquardt type algorithms, pattern recognition, signal processing, nonlinear system modeling and control, and the design of new fast matrix algorithms.
A technique based on the concept of the limiting stepsize along a normalized direction is used to guarantee a monotone decrease of the nonlinear residual norm.
The results of numerical testing are presented, including small-sized standard test problems as well as larger and harder examples, such as algebraic problems associated with the canonical decomposition of dense and sparse 3D tensors and finite-difference discretizations of 2D nonlinear boundary value problems for second-order partial differential equations.
Summary
Application areas of nonlinear least squares are numerous and include, for instance, the numerical solution of nonlinear equations arising as discrete models of physical problems, acceleration of neural network learning processes using Levenberg-Marquardt type algorithms, pattern recognition, signal processing, nonlinear system modeling and control, and the design of new fast matrix algorithms. This explains the need for further development of robust and efficient nonlinear least squares solvers.