Abstract

This paper proposes LMLS, a new limited memory trust region algorithm for large unconstrained black box least squares problems. The main features of LMLS are a new non-monotone technique, a new adaptive radius strategy, a new Broyden-like algorithm based on previous good points, and a heuristic estimate of the Jacobian matrix in a subspace with random basis indices. Our numerical results show that LMLS is robust and efficient, especially in comparison with solvers using traditional limited memory and standard quasi-Newton approximations.

Highlights

  • In this paper, we consider the unconstrained nonlinear least squares problem min f(x) := ‖E(x)‖² s.t. x ∈ Rn, (1), with high-dimensional x ∈ Rn and continuously differentiable E : Rn → Rr (r ≥ n), possibly expensive.

  • To solve the least squares problem (1), trust region methods use linear approximations of the residual vectors to build surrogate quadratic models whose accuracy is increased by restricting their feasible points.

  • We describe all steps of a new limited memory algorithm, called LMLS, using the new subspace direction (7), the new non-monotone technique (10), the new adaptive radius strategy (11), and the Broyden-like technique.
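The objective in (1) is simply the squared norm of the residual vector. A minimal sketch, treating the residual map E as a black box (the linear residual A x − b below is purely illustrative; LMLS targets expensive black box residuals):

```python
import numpy as np

def f(x, E):
    """Least squares objective f(x) = ||E(x)||^2 for a residual map E.

    E is any continuously differentiable map from R^n to R^r (r >= n);
    here it is treated only as a black box returning the residual vector.
    """
    r = E(x)
    return float(r @ r)

# Illustrative residual E(x) = A x - b with r = 3, n = 2.
A = np.array([[1.0, 0.0], [0.0, 2.0], [1.0, 1.0]])
b = np.array([1.0, 2.0, 2.0])
E = lambda x: A @ x - b

print(f(np.array([1.0, 1.0]), E))  # 0.0: x = (1, 1) solves A x = b exactly
```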


Summary

Related work

There is a huge literature on least squares and its applications. Many trust region methods estimate the Jacobian matrix by finite differences, such as CoDoSol and STRSCNE by Bellavia et al. (2004, 2012), NMPNTR by Kimiaei (2016), NATRN and NATRLS by Amini et al. (2016), LSQNONLIN from the MATLAB Toolbox, NLSQERR (an adaptive trust region strategy) by Deuflhard (2011), and DOGLEG by Nielsen (2012). These are suitable for small- and medium-scale problems. Such methods use a computational measure to decide whether the agreement between the actual reduction of the objective function and the predicted reduction of the surrogate quadratic model is good. Trust region radii may be reduced many times, producing a very small radius or even a failure.
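The agreement measure described above is the ratio of actual to predicted reduction, which drives acceptance of the step and the radius update. A minimal sketch of one such iteration (the thresholds 0.25/0.75 and factors 0.5/2.0 are common textbook choices, not the rules used in LMLS; `model_min` is a hypothetical subproblem solver):

```python
import numpy as np

def trust_region_step(f, x, delta, model_min):
    """One acceptance test of a basic trust region iteration.

    f         -- objective function
    x         -- current point
    delta     -- current trust region radius
    model_min -- returns a step p (||p|| <= delta) minimizing the surrogate
                 quadratic model, together with the predicted reduction
    """
    p, pred = model_min(x, delta)           # step and predicted reduction
    ared = f(x) - f(x + p)                  # actual reduction
    rho = ared / pred                       # agreement measure
    if rho < 0.25:                          # poor agreement: shrink radius
        delta *= 0.5
    elif rho > 0.75 and np.linalg.norm(p) >= 0.99 * delta:
        delta *= 2.0                        # good agreement at the boundary: expand
    x_new = x + p if rho > 0 else x         # accept only if f decreased
    return x_new, delta
```

Repeated shrinking when `rho` stays small is exactly what produces the very small radii (or failures) noted above.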

Overview of the new method
A new subspace Gauss–Newton method
New non-monotone and adaptive strategies
A subspace dogleg algorithm
Broyden-like technique
A limited memory trust region algorithm
Codes compared
Default for tuning parameters of LMLS
The efficiency and robustness of a solver
Summarizing tables