Abstract

Optimization is now considered a branch of computational science. The field seeks to answer the question "what is best?" for problems in which the quality of any answer can be expressed numerically. One of the most well-known methods for solving nonlinear, unconstrained optimization problems is the conjugate gradient (CG) method, and the Hestenes-Stiefel (HS-CG) formula is one of the oldest and most effective CG formulas. With an exact line search, the HS method achieves global convergence; this is not guaranteed with an inexact line search (ILS). Furthermore, the HS method does not always satisfy the descent property. The goal of this work is to create a new (modified) formula by reformulating the classical HS-CG parameter and adding a new term to the classical HS-CG formula. It is crucial that the proposed method generate a search direction with the sufficient descent property (SDP) under the strong Wolfe-Powell line search (sWPLS) at every iteration, and that the global convergence property (GCP) be guaranteed for general non-convex functions. The modified HS-CG (mHS-CG) method satisfies the SDP regardless of the line search type and guarantees the GCP under the inexact sWPLS. Under the sWPLS, the modified formula has the further advantage of keeping the modified scalar non-negative. This paper is significant in that it quantifies how much the new modification improves performance over the standard HS method. Numerical experiments comparing the mHS-CG method under the sWPLS with the standard HS method show that the CG method with the mHS-CG conjugate parameter is more robust and effective than the CG method without it.
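For context, the sketch below shows the classical HS-CG iteration that the paper modifies: the standard Hestenes-Stiefel parameter beta_k = (g_{k+1}^T y_k) / (d_k^T y_k) with y_k = g_{k+1} - g_k, together with the common non-negativity safeguard max(beta_k, 0). This is a minimal illustration of the baseline method only; the paper's mHS-CG parameter is not reproduced here, and the function name hs_cg and the use of SciPy's Wolfe line search are illustrative choices, not the authors' implementation.

    import numpy as np
    from scipy.optimize import line_search

    def hs_cg(f, grad, x0, tol=1e-6, max_iter=500):
        """Classical Hestenes-Stiefel nonlinear CG (baseline, not the paper's mHS-CG)."""
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            # SciPy's line_search enforces the (strong) Wolfe conditions.
            alpha = line_search(f, grad, x, d)[0]
            if alpha is None:
                alpha = 1e-4              # fallback step if the line search fails
            x_new = x + alpha * d
            g_new = grad(x_new)
            y = g_new - g                 # y_k = g_{k+1} - g_k
            # Under the Wolfe curvature condition, d @ y > 0, so the quotient is defined.
            beta = (g_new @ y) / (d @ y)  # classical HS parameter
            beta = max(beta, 0.0)         # common non-negativity safeguard (HS+)
            d = -g_new + beta * d         # new search direction
            x, g = x_new, g_new
        return x

As a usage example, the iteration can be run on the Rosenbrock function via hs_cg(rosen, rosen_der, np.array([-1.2, 1.0])), with rosen and rosen_der imported from scipy.optimize. The max(beta, 0) safeguard mirrors the non-negativity that, per the abstract, the mHS-CG formula maintains by construction under the sWPLS.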
