Abstract

Computing the unconstrained minimizer of large-scale problems with Newton's method can be very expensive, because each iteration requires solving a large linear system for the Newton direction. In this paper, we develop an efficient method for solving large-scale unconstrained optimization problems with block diagonal Hessian matrices that reduces the cost of computing the Newton direction. The method combines Newton's method with the 2-point explicit group successive over-relaxation (2EGSOR) block iterative method. To evaluate the performance of the developed method, we use Newton's method with Gauss–Seidel point iteration (Newton-GS) and Newton's method with successive over-relaxation point iteration (Newton-SOR) as reference methods. The numerical experiments show that the developed algorithm is more efficient than the reference methods, requiring less execution time and fewer iterations. Across 90 test cases, the speedup ratio of the proposed method is up to 659.87 times over the Newton-SOR method and up to 1963.57 times over the Newton-GS method.
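To illustrate the underlying idea, the following is a minimal sketch of a Newton iteration in which the Newton system H d = -g is solved approximately by inner point-SOR sweeps rather than by a direct factorization, corresponding to the Newton-SOR reference scheme; the proposed 2EGSOR variant would instead update 2-point blocks within each sweep. The function names, relaxation factor, tolerances, and test problem below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sor_solve(A, b, omega=1.2, tol=1e-10, max_sweeps=500):
    """Approximately solve A x = b with point successive over-relaxation (SOR)."""
    n = len(b)
    x = np.zeros(n)
    for _ in range(max_sweeps):
        x_old = x.copy()
        for i in range(n):
            # Use already-updated entries x[:i] and old entries x_old[i+1:]
            sigma = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
            x[i] = (1 - omega) * x_old[i] + omega * (b[i] - sigma) / A[i, i]
        if np.linalg.norm(x - x_old, np.inf) < tol:
            break
    return x

def newton_sor(grad, hess, x0, tol=1e-8, max_iter=100, omega=1.2):
    """Newton iteration with the Newton direction obtained by inner SOR sweeps."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g, np.inf) < tol:
            break
        d = sor_solve(hess(x), -g, omega=omega)  # inner iterative solve of H d = -g
        x = x + d
    return x

# Hypothetical test problem: a strictly convex quadratic with known minimizer at the origin.
grad = lambda x: np.array([4 * x[0] - x[1], 2 * x[1] - x[0]])
hess = lambda x: np.array([[4.0, -1.0], [-1.0, 2.0]])
print(newton_sor(grad, hess, [5.0, -3.0]))
```

For a block diagonal Hessian, the inner solve decouples across blocks, which is what makes a block iterative scheme such as 2EGSOR attractive for reducing per-iteration cost.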
