Abstract

Spectral conjugate gradient methods are attractive, and it has been shown that, when used properly, they are effective for minimizing strictly convex quadratic functions. In this study, a novel spectral conjugate gradient method is proposed for large-scale unconstrained optimization problems. Motivated by the benefits of the approximate optimal step-size strategy used in the gradient method, we devise a new formula for determining the spectral and conjugate parameters. The resulting search direction satisfies both the spectral property and the sufficient descent condition. The global convergence of the proposed method is established under a set of appropriate assumptions.

Consider the unconstrained optimization problem in n variables:

min f(x), x ∈ R^n, (1)

where f : R^n → R is a continuously differentiable function. Conjugate gradient methods are among the most effective strategies for solving problem (1). They generate iterates of the form

x_(k+1) = x_k + α_k d_k, k = 0, 1, 2, …, (2)

where x_0 is the starting point, α_k is the step size, g_k = ∇f(x_k), and the search direction d_k is given by

d_k = -g_k for k = 0, and d_k = -g_k + β_k d_(k-1) for k ≥ 1. (3)
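The iteration (2)–(3) can be sketched in code. The paper's specific spectral and conjugate parameters are not given in this excerpt, so the sketch below substitutes two illustrative assumptions: the classical Fletcher–Reeves choice for β_k and a simple Armijo backtracking line search for α_k. It shows the generic scheme only, not the proposed method.

```python
import numpy as np

def conjugate_gradient(f, grad, x0, max_iter=200, tol=1e-6):
    """Generic nonlinear conjugate gradient iteration per (2)-(3).

    Assumptions (not from the paper): Fletcher-Reeves beta_k and an
    Armijo backtracking line search for the step size alpha_k.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # d_0 = -g_0
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:          # stop when the gradient is small
            break
        # Armijo backtracking: shrink alpha until sufficient decrease holds
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d                # x_{k+1} = x_k + alpha_k d_k  (2)
        g_new = grad(x_new)
        beta = g_new.dot(g_new) / g.dot(g)   # Fletcher-Reeves beta_k (assumed)
        d = -g_new + beta * d                # d_k = -g_k + beta_k d_{k-1}  (3)
        if g_new.dot(d) >= 0:                # safeguard: restart if not descent
            d = -g_new
        x, g = x_new, g_new
    return x
```

For example, minimizing the strictly convex quadratic f(x) = x_1^2 + 2 x_2^2 from the starting point (3, -2) drives the iterates toward the minimizer at the origin.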
