Abstract

In this paper, we present a new conjugate gradient method in which the search direction is computed by minimizing a selected approximate model over a two-dimensional subspace. Specifically, if the objective function is not close to a quadratic, the search direction is generated from a conic model; otherwise, a quadratic model is used. The search direction of the proposed method is proved to possess the sufficient descent property. With a modified nonmonotone line search, we establish the global convergence of the proposed method under appropriate assumptions, and we also analyze its R-linear convergence rate. Numerical results on two different test function collections show that the proposed algorithm is efficient.
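To make the subspace idea concrete, the sketch below implements the quadratic-model branch only: the direction is taken as d = mu·g + nu·s over span{g_k, s_{k-1}}, with the curvature terms of the model estimated from the secant pair (s, y). This is an illustrative simplification, not the paper's exact algorithm: the conic-model branch is omitted, a plain Armijo backtracking line search replaces the modified nonmonotone one, and the scalar estimate of gᵀBg is a Barzilai–Borwein-type assumption.

```python
import numpy as np

def subspace_cg(f, grad, x0, max_iter=200, tol=1e-8):
    """Illustrative two-dimensional subspace-minimization CG sketch.

    Quadratic-model branch only; curvature terms are estimated from the
    secant pair (s, y). Not the paper's exact method.
    """
    x = x0.astype(float)
    g = grad(x)
    d = -g
    s = y = None
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if s is not None:
            gg, gs = g @ g, g @ s
            gy = g @ y                    # secant estimate of g^T B s
            sy = s @ y                    # secant value of s^T B s
            # Barzilai-Borwein-type scale for g^T B g (an assumption)
            rho = gg * (y @ y) / sy if sy > 0 else gg
            # Minimize the quadratic model over span{g, s}: 2x2 system
            M = np.array([[rho, gy], [gy, sy]])
            b = -np.array([gg, gs])
            try:
                mu, nu = np.linalg.solve(M, b)
                d = mu * g + nu * s
            except np.linalg.LinAlgError:
                d = -g
            # Safeguard: fall back to steepest descent if not a descent direction
            if g @ d > -1e-12 * np.linalg.norm(g) * np.linalg.norm(d):
                d = -g
        # Backtracking Armijo line search (monotone, for simplicity)
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * (g @ d) and t > 1e-12:
            t *= 0.5
        s = t * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        x, g = x_new, g_new
    return x
```

On a strongly convex quadratic the subspace step reproduces (near-)exact CG behavior, while the safeguard and the Armijo condition keep every step a descent step.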
