Abstract

The Hestenes–Stiefel (HS) conjugate gradient algorithm is efficient for solving large-scale smooth optimization problems because of its simplicity and low computational cost. The algorithm has also been applied to large-scale nonsmooth problems, nonlinear equations, and practical applications. In this paper, the authors propose modified HS conjugate gradient algorithms that not only address large-scale nonlinear equations and nonsmooth convex problems but also have the following properties: i) the algorithms possess a descent property and the trust-region property without any additional conditions; ii) they combine the steepest descent method with the conjugate gradient method; iii) they enjoy global convergence; and iv) they can be successfully applied to nonlinear optimization problems and image restoration.
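The modifications described above build on the classic HS update, in which the search direction is d_{k+1} = -g_{k+1} + β_k d_k with β_k^{HS} = g_{k+1}^T y_k / (d_k^T y_k) and y_k = g_{k+1} - g_k. As context for the abstract, here is a minimal sketch of the standard (unmodified) HS method for a smooth problem, paired with a simple Armijo backtracking line search; the function names, tolerances, and safeguard rules are illustrative assumptions, not the authors' proposed algorithms.

```python
import numpy as np

def hs_conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=500):
    """Sketch of the classic Hestenes-Stiefel conjugate gradient method
    with an Armijo backtracking line search (illustrative only)."""
    x = x0.astype(float)
    g = grad(x)
    d = -g  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:
            d = -g  # safeguard: restart if d is not a descent direction
        # Armijo backtracking: shrink the step until sufficient decrease holds
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g  # gradient difference y_k = g_{k+1} - g_k
        denom = d.dot(y)
        # Hestenes-Stiefel parameter; fall back to steepest descent
        # when the denominator is (near) zero
        beta = g_new.dot(y) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: minimize the strongly convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
grad = lambda x: A.dot(x) - b
x_star = hs_conjugate_gradient(f, grad, np.zeros(2))
```

For the quadratic above, the minimizer is the solution of A x = b, namely (0.2, 0.4). The paper's modified schemes differ from this textbook version precisely in the properties listed in the abstract, e.g. guaranteeing the descent and trust-region properties without extra conditions.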
