Spectral conjugate gradient methods, with their simple structure and good numerical performance, form an effective class of methods for solving large-scale unconstrained optimization problems. In this paper, based on the quasi-Newton direction and the quasi-Newton condition, and motivated by the idea of the spectral conjugate gradient method as well as Dai and Kou's technique for selecting the conjugate parameter [SIAM J. Optim. 23 (2013), pp. 296–320], a new approach for generating spectral parameters is presented. A new double-truncating technique is introduced that ensures both the sufficient descent property of the search directions and the boundedness of the sequence of spectral parameters. A new spectral conjugate gradient method for large-scale unconstrained optimization is then proposed. The method is globally convergent under either the strong Wolfe line search or the generalized Wolfe line search. Finally, extensive comparative numerical experiments on large-scale instances with one thousand to two million variables are reported. The numerical results show that the proposed method compares favorably with existing methods.
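For orientation, the following Python sketch illustrates a generic spectral conjugate gradient iteration of the form d_k = -theta_k g_k + beta_k d_{k-1} under a strong Wolfe line search. The particular choices of theta_k, beta_k, the restart safeguard, and the test problem are illustrative assumptions only; they do not reproduce the spectral parameters or the double-truncating technique proposed in the paper.

```python
# A minimal, generic sketch of a spectral conjugate gradient iteration under a
# strong Wolfe line search. This is NOT the method proposed in the paper: the
# spectral parameter theta_k, the conjugate parameter beta_k (a Fletcher-Reeves
# choice here), and the restart safeguard are illustrative placeholders only.
import numpy as np
from scipy.optimize import line_search

def spectral_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # SciPy's line_search enforces the strong Wolfe conditions.
        alpha, *_ = line_search(f, grad, x, d, gfk=g)
        if alpha is None:                    # line search failed: restart
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)     # Fletcher-Reeves beta_k (placeholder)
        theta = 1.0                          # spectral parameter theta_k (placeholder)
        d = -theta * g_new + beta * d        # spectral CG search direction
        if g_new @ d >= 0:                   # crude descent safeguard standing in for
            d = -g_new                       # the paper's double-truncating technique
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    from scipy.optimize import rosen, rosen_der
    x_opt = spectral_cg(rosen, rosen_der, np.zeros(50))
    print(np.linalg.norm(rosen_der(x_opt)))  # gradient norm at the returned point
```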