Abstract
In this paper, a three-parameter subspace conjugate gradient method is proposed for solving large-scale unconstrained optimization problems. The embedded parameters are determined by minimizing a quadratic approximation model of the objective function over a new special three-dimensional subspace, which yields the corresponding algorithm. Global convergence of the proposed method for general nonlinear functions is established under mild assumptions. In numerical experiments, the proposed algorithm is compared with SMCG_NLS and SMCG_Conic; the results show that it is robust and efficient.
Highlights
Subspace Conjugate Gradient
The conjugate gradient method is one of the most important methods for solving large-scale unconstrained optimization problems because of its simple structure, low computational cost, modest storage requirements, and fast convergence.
The search direction is calculated by minimizing the quadratic approximation model on the two-dimensional subspace Ω_{k+1} = Span{g_{k+1}, s_k}, namely d_{k+1} = μ_k g_{k+1} + ν_k s_k, where μ_k and ν_k are parameters and s_k = x_{k+1} − x_k.
We compare the numerical performance of the TSCG algorithm with SMCG_NLS and SMCG_Conic.
Summary
The conjugate gradient method is one of the most important methods for solving large-scale unconstrained optimization problems because of its simple structure, low computational cost, modest storage requirements, and fast convergence. The search direction is calculated by minimizing the quadratic approximation model on the two-dimensional subspace Ω_{k+1} = Span{g_{k+1}, s_k}, namely d_{k+1} = μ_k g_{k+1} + ν_k s_k, where μ_k and ν_k are parameters and s_k = x_{k+1} − x_k. Inspired by SMCG, some researchers began to combine the conjugate gradient method with subspace techniques. Inspired by Andrei, Yang et al. [16] carried out a similar study: they applied the subspace minimization technique to another special three-dimensional subspace Ω_{k+1} = Span{g_{k+1}, s_k, s_{k−1}} and obtained a new SMCG method (STT). Building on Yang's results, the more complex three-parameter case was analyzed, leading to a new subspace minimization conjugate gradient method (SMCG_NLS).
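The subspace-minimization idea above can be sketched in a few lines. The following is an illustrative Python sketch, not the paper's exact algorithm: it computes d = μ g + ν s by minimizing the quadratic model m(d) = gᵀd + ½ dᵀBd over Span{g, s}, taking B to be the exact Hessian of a small quadratic test problem. Real SMCG-type methods instead estimate the curvature terms gᵀBg, gᵀBs, and sᵀBs from gradient differences (e.g. via the secant condition Bs ≈ y), so the Hessian here is a stand-in assumption for illustration only.

```python
import numpy as np

def subspace_direction(g, s, B):
    """Minimize m(d) = g^T d + 0.5 d^T B d over d = mu*g + nu*s.

    Substituting d = mu*g + nu*s and setting the partial derivatives
    with respect to (mu, nu) to zero gives a 2x2 linear system.
    """
    M = np.array([[g @ B @ g, g @ B @ s],
                  [s @ B @ g, s @ B @ s]])
    rhs = -np.array([g @ g, s @ g])
    mu, nu = np.linalg.solve(M, rhs)
    return mu * g + nu * s

# Small quadratic test problem f(x) = 0.5 x^T A x - b^T x, so grad f = A x - b.
A = np.diag([1.0, 4.0, 9.0])
b = np.array([1.0, 1.0, 1.0])

x_prev = np.zeros(3)
x = np.array([0.5, 0.3, 0.1])
g = A @ x - b          # current gradient g_{k+1}
s = x - x_prev         # step s_k = x_{k+1} - x_k

d = subspace_direction(g, s, A)

# By construction, d is a descent direction (g^T d < 0) and the model
# gradient B d + g is orthogonal to the subspace Span{g, s}.
r = A @ d + g
print(g @ d < 0)
print(abs(g @ r) < 1e-10, abs(s @ r) < 1e-10)
```

The 2x2 system is the first-order optimality condition of the restricted model; when B is positive definite and g, s are linearly independent, it has a unique solution and the resulting d is automatically a descent direction.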