Abstract

<span>The conjugate gradient (CG) method plays a special role in solving large-scale unconstrained optimization problems. In this paper, we propose a new family of CG coefficients that satisfies the sufficient descent condition and possesses global convergence properties; the proposed method is similar to that of Wei et al. [7]. The global convergence result is established under the strong Wolfe-Powell line search. Numerical results on a set of test functions show that the new formula achieves the best performance in CPU time, number of iterations, and number of gradient evaluations when compared with the FR, PRP, DY, and WYL methods.</span>
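To make the setting concrete, the following is a minimal sketch of a nonlinear CG iteration using the WYL coefficient of Wei et al. [7], the family the abstract says the new method resembles; the paper's own new coefficient is not reproduced here. A simple backtracking (Armijo) line search with a steepest-descent restart safeguard stands in for the strong Wolfe-Powell search used in the convergence analysis, and the test function and helper names are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch: nonlinear CG with the WYL coefficient (Wei et al. [7]).
# The backtracking Armijo line search below is a simplification of the
# strong Wolfe-Powell search required by the convergence theory.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def norm(a):
    return dot(a, a) ** 0.5

def cg_wyl(f, grad, x, max_iter=200, tol=1e-8):
    g = grad(x)
    d = [-gi for gi in g]                      # initial steepest-descent step
    for _ in range(max_iter):
        if norm(g) < tol:
            break
        slope = dot(g, d)
        if slope >= 0:                         # safeguard: restart if not descent
            d = [-gi for gi in g]
            slope = dot(g, d)
        # Backtracking Armijo line search (placeholder for strong Wolfe-Powell)
        alpha, c1, fx = 1.0, 1e-4, f(x)
        while (f([xi + alpha * di for xi, di in zip(x, d)])
               > fx + c1 * alpha * slope and alpha > 1e-12):
            alpha *= 0.5
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        # WYL coefficient: beta = g+^T (g+ - (||g+||/||g||) g) / ||g||^2
        ratio = norm(g_new) / norm(g)
        beta = dot(g_new, [gn - ratio * gi
                           for gn, gi in zip(g_new, g)]) / dot(g, g)
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x

# Usage on a small convex test function (illustrative, not from the paper):
f = lambda v: (v[0] - 3) ** 2 + 10 * (v[1] + 1) ** 2
grad = lambda v: [2 * (v[0] - 3), 20 * (v[1] + 1)]
x_star = cg_wyl(f, grad, [0.0, 0.0])           # converges toward [3, -1]
```

The restart safeguard is needed because, unlike the strong Wolfe-Powell search, a pure Armijo backtracking step does not by itself guarantee that the next WYL direction is a descent direction.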


