Abstract

This paper presents methods for solving large-scale sparse unconstrained optimization problems based on a successive partitioning group correction algorithm. In large-scale optimization, solving the Newton-like equations exactly at each iteration can be expensive and may not be justified when the iterate is far from a solution. Instead, an inexact solution to the Newton-like equations is computed by a conjugate gradient method. The methods also depend on a symmetric consistent partition of the columns of the Hessian matrix. A q-superlinear convergence result and an r-convergence rate estimate show that the methods have good local convergence properties. Global convergence is proven, and the numerical results show that the methods may be competitive with some currently used algorithms.
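To make the inexact-Newton idea in the abstract concrete, here is a minimal sketch (not the paper's successive partitioning group correction algorithm) of one truncated Newton step: the equation H d = -g is solved only approximately by conjugate gradients, stopping once the residual drops below a relative tolerance. The names `inexact_newton_step`, `hess_vec`, and the forcing term `eta` are illustrative assumptions, not notation from the paper.

```python
import numpy as np

def inexact_newton_step(g, hess_vec, eta=0.1, max_iter=50):
    """Approximately solve the Newton equation H d = -g by conjugate
    gradients, stopping once ||H d + g|| <= eta * ||g|| (a standard
    inexact-Newton forcing condition). `hess_vec(v)` returns H @ v,
    so a large sparse Hessian never has to be formed explicitly."""
    d = np.zeros_like(g, dtype=float)
    r = -g.astype(float)              # residual of H d = -g at d = 0
    p = r.copy()
    rr = r @ r
    tol = eta * np.linalg.norm(g)
    for _ in range(max_iter):
        Hp = hess_vec(p)
        pHp = p @ Hp
        if pHp <= 0.0:                # negative curvature: stop early
            break
        alpha = rr / pHp
        d = d + alpha * p
        r = r - alpha * Hp
        rr_new = r @ r
        if np.sqrt(rr_new) <= tol:    # inexact solve accurate enough
            break
        p = r + (rr_new / rr) * p     # standard CG direction update
        rr = rr_new
    return d if d.any() else -g       # fall back to steepest descent

# Usage on a toy quadratic with a sparse (diagonal) Hessian:
H_diag = np.array([4.0, 2.0, 1.0])
g = np.array([1.0, -2.0, 0.5])
step = inexact_newton_step(g, lambda v: H_diag * v)
```

Because the solve is truncated, early iterations (far from a solution, where a loose `eta` suffices) cost only a few Hessian-vector products, which is the cost saving the abstract refers to.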
