Abstract

In this paper, a new subspace minimization conjugate gradient method based on a modified secant equation is proposed and analyzed. In a classical subspace minimization conjugate gradient method, the search direction is derived by minimizing an approximate quadratic model of the objective function over a two-dimensional subspace. The approximate Hessian matrix in this quadratic model is usually required to satisfy the standard secant equation; here we instead consider an approximate Hessian matrix that satisfies a modified secant equation. We give rules for choosing between the two: when these rules hold, the standard secant equation is used; otherwise, the modified one is used. We prove that the proposed directions satisfy the sufficient descent property under some additional conditions. We also present a modified nonmonotone Wolfe line search and establish the global convergence of the proposed method for general nonlinear functions under mild assumptions. Numerical comparisons with the well-known CG_DESCENT (5.3) (Hager and Zhang in SIAM J Optim 16(1):170–192, 2005) and SMCG_BB (Liu and Liu in J Optim Theory Appl 180(3):879–906, 2019) show that the proposed algorithm is very promising.
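For context, the following is a minimal sketch of the quantities named in the abstract, in standard subspace minimization CG notation (g_k = \nabla f(x_k), s_{k-1} = x_k - x_{k-1}, y_{k-1} = g_k - g_{k-1}). The modified secant equation shown is the widely used Zhang–Deng–Chen variant; the paper's exact variant and its switching rules may differ, so this is illustrative only.

% Sketch, not necessarily the paper's exact formulation.
% Search direction: minimize a quadratic model over a two-dimensional subspace,
% so that d_k = mu_k g_k + nu_k s_{k-1} for scalars mu_k, nu_k:
\[
  d_k \;=\; \arg\min_{d \in \Omega_k} \; g_k^{T} d + \tfrac{1}{2}\, d^{T} B_k d,
  \qquad
  \Omega_k = \operatorname{span}\{g_k,\, s_{k-1}\}.
\]
% Standard secant equation on the approximate Hessian B_k:
\[
  B_k s_{k-1} = y_{k-1},
  \qquad s_{k-1} = x_k - x_{k-1}, \quad y_{k-1} = g_k - g_{k-1}.
\]
% A widely used modified secant equation (Zhang-Deng-Chen type), which
% additionally exploits the function values f_{k-1} and f_k:
\[
  B_k s_{k-1} = \bar{y}_{k-1},
  \qquad
  \bar{y}_{k-1} = y_{k-1} + \frac{\vartheta_{k-1}}{s_{k-1}^{T} s_{k-1}}\, s_{k-1},
\]
\[
  \vartheta_{k-1} = 6\,\bigl(f_{k-1} - f_k\bigr) + 3\,\bigl(g_{k-1} + g_k\bigr)^{T} s_{k-1}.
\]

The modified equation matches higher-order Taylor information of f along s_{k-1}, which is why it can be preferable when the quadratic model is a poor local approximation; the paper's rules decide per iteration which equation B_k should satisfy.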
