Abstract

We present a group update algorithm based on a truncated trust region strategy for large-scale sparse unconstrained optimization. In large sparse optimization, computing the whole Hessian matrix and solving the Newton-like equations exactly at each iteration can be considerably expensive. In our method, the elements of the Hessian matrix are updated successively and periodically, group by group, during the iterations, and an inexact solution to the Newton-like equations is obtained by truncating the inner iteration under a certain control rule. In addition, we allow the current direction to exceed the trust region bound when it is a good descent direction satisfying certain descent conditions. Good convergence properties are retained, and we compare the computational behavior of our method with that of other algorithms. Our numerical tests show that the algorithm is promising and quite effective, and that its performance is comparable to or better than that of other available algorithms.
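To make the idea of truncating the inner iteration concrete, the following is a minimal sketch of one standard way to solve the Newton-like equations inexactly inside a trust region: a Steihaug-Toint truncated conjugate gradient loop. The abstract does not specify the paper's exact control rule, so the stopping tests here (residual tolerance, boundary crossing, negative curvature) are illustrative assumptions, not the authors' algorithm; the names `steihaug_cg` and `_to_boundary` are ours.

```python
import numpy as np

def steihaug_cg(H, g, delta, tol=1e-8, max_iter=50):
    """Approximately solve H p = -g subject to ||p|| <= delta by
    conjugate gradients, truncating the inner iteration early.
    Illustrative stand-in for a truncated trust-region inner solve;
    the paper's actual truncation control rule is not given here."""
    p = np.zeros_like(g)
    r = -g.copy()            # residual of H p = -g at p = 0
    d = r.copy()             # initial search direction
    if np.linalg.norm(r) < tol:
        return p
    for _ in range(max_iter):
        Hd = H @ d
        dHd = d @ Hd
        if dHd <= 0:
            # Negative curvature: follow d to the trust-region boundary.
            return _to_boundary(p, d, delta)
        alpha = (r @ r) / dHd
        p_next = p + alpha * d
        if np.linalg.norm(p_next) >= delta:
            # Step would leave the region: truncate at the boundary.
            return _to_boundary(p, d, delta)
        r_next = r - alpha * Hd
        if np.linalg.norm(r_next) < tol:
            return p_next    # residual small enough: stop the inner loop
        beta = (r_next @ r_next) / (r @ r)
        p, r = p_next, r_next
        d = r + beta * d
    return p

def _to_boundary(p, d, delta):
    """Return p + tau*d with tau >= 0 chosen so that ||p + tau*d|| = delta."""
    a = d @ d
    b = 2.0 * (p @ d)
    c = p @ p - delta**2
    tau = (-b + np.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
    return p + tau * d
```

With a large trust region the loop reduces to plain CG and returns the full Newton step; with a small radius it stops at the boundary, which is the kind of inexpensive, inexact step the method above exploits. A group update scheme would additionally refresh only some groups of Hessian entries per outer iteration, which this sketch does not model.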
