Abstract
In this paper, we present two families of modified three-term conjugate gradient methods for solving large-scale unconstrained smooth optimization problems. We show that the new families satisfy the Dai-Liao conjugacy condition and the sufficient descent condition under any line search technique that guarantees the positivity of $y_k^{T} s_k$. For uniformly convex functions, we prove that our families are globally convergent under the weak Wolfe-Powell line search technique and standard assumptions on the objective function. We also establish a weaker global convergence theorem for general smooth functions under similar assumptions. Numerical experiments on 260 standard test problems, comparing our methods with seven recently developed conjugate gradient methods, illustrate that the members of our families are numerically efficient and effective.
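To fix notation (a generic sketch only; the specific scalars $\beta_k$ and $\theta_k$ that define our two families are given in the body of the paper), let $g_k = \nabla f(x_k)$, $s_k = x_{k+1} - x_k$, and $y_k = g_{k+1} - g_k$. A typical three-term conjugate gradient method generates search directions of the form

$$d_{k+1} = -g_{k+1} + \beta_k d_k + \theta_k y_k, \qquad d_0 = -g_0,$$

and the two conditions named above read, in their standard forms,

$$d_{k+1}^{T} y_k = -t\, g_{k+1}^{T} s_k \quad (t \ge 0) \qquad \text{and} \qquad g_k^{T} d_k \le -c\, \|g_k\|^{2} \quad (c > 0),$$

the first being the Dai-Liao conjugacy condition and the second the sufficient descent condition.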