The major stationary iterative method used to solve nonlinear optimization problems is the quasi-Newton (QN) method. Symmetric Rank-One (SR1) is a method in the quasi-Newton family. The algorithm converges rapidly towards the true Hessian and has computational advantages for sparse or partially separable problems [1]. Investigating the efficiency of the SR1 algorithm is therefore significant. However, the matrix generated by the SR1 update is not guaranteed to be positive definite, and the denominator of the update may vanish. To overcome these drawbacks and achieve better performance than the standard SR1 method, in this work we derive a new vector, depending on the Barzilai-Borwein step size, to obtain a new SR1 update. This updating formula is then combined with the preconditioned conjugate gradient (PCG) method. Using an inexact line search governed by the strong Wolfe conditions, the new SR1 method is proposed and its performance is evaluated against the conventional SR1 method. It is proven that the updated matrix of the new SR1 method is symmetric and positive definite, provided the initial matrix is the identity. In this study, the proposed method solved 13 test problems effectively in terms of the number of iterations (NI) and the number of function evaluations (NF). With respect to NF, the new SR1 method also outperformed the classic SR1 method. The proposed method is shown to be more efficient than the original method in solving relatively large-scale problems (5,000 variables). The numerical results indicate that the proposed method is significantly faster, effective, and well suited to large-dimensional nonlinear problems.
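For context, the classical SR1 update and the Barzilai-Borwein step sizes referred to above take the following standard forms; this is a sketch of the well-known formulas only, not the paper's modified vector or its specific update:

```latex
% Classical SR1 update of the Hessian approximation B_k, whose denominator may vanish:
\[
B_{k+1} \;=\; B_k \;+\; \frac{(y_k - B_k s_k)(y_k - B_k s_k)^{T}}{(y_k - B_k s_k)^{T} s_k},
\qquad s_k = x_{k+1} - x_k,\quad y_k = \nabla f(x_{k+1}) - \nabla f(x_k).
\]
% A common safeguard skips the update when the denominator is too small:
% |(y_k - B_k s_k)^T s_k| >= r \, \|s_k\| \, \|y_k - B_k s_k\|, with a small r in (0,1).
% Barzilai-Borwein step sizes (either form is used in practice):
\[
\alpha_k^{BB1} = \frac{s_{k-1}^{T} s_{k-1}}{s_{k-1}^{T} y_{k-1}},
\qquad
\alpha_k^{BB2} = \frac{s_{k-1}^{T} y_{k-1}}{y_{k-1}^{T} y_{k-1}}.
\]
```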