Abstract

In this paper, we present a new conjugate gradient method in which the search direction is computed by minimizing a selected approximate model over a two-dimensional subspace. Specifically, when the objective function is not close to a quadratic, the search direction is generated from a conic model; otherwise, a quadratic model is used. The direction of the proposed method is proved to possess the sufficient descent property. With a modified nonmonotone line search, we establish the global convergence of the proposed method under appropriate assumptions, and we also analyze its R-linear convergence. Numerical results on two different collections of test functions show that the proposed algorithm is efficient.
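To make the subspace idea concrete, the sketch below shows one generic way a search direction can be obtained by minimizing a quadratic model over the two-dimensional subspace spanned by the current gradient and the previous step. The secant approximation, the Barzilai-Borwein-type curvature estimate, and the steepest-descent fallback are illustrative assumptions for this sketch only; they are not the specific conic/quadratic models or safeguards proposed in the paper.

```python
import numpy as np

def subspace_quadratic_direction(g, s, y, eps=1e-12):
    """Minimal sketch: direction d = u*g + v*s minimizing a quadratic model
    m(d) = g^T d + 0.5 * d^T B d restricted to span{g, s}.

    g : current gradient
    s : previous step (x_k - x_{k-1})
    y : gradient difference (g_k - g_{k-1})

    Curvature terms are approximated via the secant condition B s ~ y and a
    BB-type scaling for g^T B g (illustrative choices, not the paper's model).
    """
    gg, gs, gy, sy = g @ g, g @ s, g @ y, s @ y

    if sy <= eps:                 # no usable curvature information
        return -g                 # fall back to steepest descent

    rho = (y @ y) / sy * gg       # rough estimate of g^T B g (assumption)

    # Stationarity of the restricted model gives a 2x2 linear system
    # for the subspace coefficients (u, v).
    A = np.array([[rho, gy],
                  [gy,  sy]])
    b = -np.array([gg, gs])

    det = A[0, 0] * A[1, 1] - A[0, 1] ** 2
    if det <= eps:                # model not convex enough on the subspace
        return -g

    u, v = np.linalg.solve(A, b)
    return u * g + v * s
```

In a full method of the kind described above, this quadratic-model direction would be replaced by a conic-model counterpart whenever a ratio test indicates the objective is far from quadratic, and the resulting direction would then be combined with a (modified nonmonotone) line search.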
