Abstract

In this paper, a three-parameter subspace conjugate gradient method is proposed for solving large-scale unconstrained optimization problems. By minimizing a quadratic approximation model of the objective function on a new special three-dimensional subspace, the embedded parameters are determined and the corresponding algorithm is obtained. Global convergence of the proposed method for general nonlinear functions is established under mild assumptions. In numerical experiments, the proposed algorithm is compared with SMCG_NLS and SMCG_Conic; the results show that it is robust and efficient.

Highlights

  • Subspace conjugate gradient: the conjugate gradient method is one of the most important methods for solving large-scale unconstrained optimization problems because of its simple structure, low computation and storage costs, fast convergence, etc.

  • The search direction is calculated by minimizing the quadratic approximation model on the two-dimensional subspace Ω_{k+1} = Span{g_{k+1}, s_k}, namely d_{k+1} = μ_k g_{k+1} + ν_k s_k, where μ_k and ν_k are parameters and s_k = x_{k+1} − x_k

  • We compare the numerical performance of the TSCG algorithm with SMCG_NLS and SMCG_Conic


Summary

Introduction

The conjugate gradient method is one of the most important methods for solving large-scale unconstrained optimization problems because of its simple structure, low computation and storage costs, fast convergence, etc. The search direction is calculated by minimizing the quadratic approximation model on the two-dimensional subspace Ω_{k+1} = Span{g_{k+1}, s_k}, namely d_{k+1} = μ_k g_{k+1} + ν_k s_k, where μ_k and ν_k are parameters and s_k = x_{k+1} − x_k. Inspired by SMCG, some researchers began to investigate conjugate gradient methods combined with subspace techniques. Following Andrei, Yang et al. [16] carried out a similar study: they applied the subspace minimization technique to another special three-dimensional subspace Ω_{k+1} = Span{g_{k+1}, s_k, s_{k−1}} and obtained a new SMCG method (STT). Building on Yang's results, a later work analyzed a more complex three-parameter model and proposed a new subspace minimization conjugate gradient method (SMCG_NLS).
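To make the subspace minimization idea concrete, the following is a minimal sketch (in Python) of one step on the two-dimensional subspace Span{g_{k+1}, s_k} described above. The function name smcg_direction, the secant-based approximation B s_k ≈ y_k = g_{k+1} − g_k, and the Barzilai-Borwein-type scalar estimate of g^T B g are illustrative assumptions for this sketch, not the exact formulas of the paper.

    import numpy as np

    def smcg_direction(g, s, y):
        """Sketch of one subspace-minimization CG step on Span{g_{k+1}, s_k}.

        Minimizes the quadratic model q(d) = g^T d + 0.5 * d^T B d over
        d = mu*g + nu*s. Hessian products with s are approximated via the
        secant condition B s ≈ y = g_{k+1} - g_k; g^T B g is approximated
        by rho * g^T g with rho = y^T y / s^T y (an assumed
        Barzilai-Borwein-type scaling, not the paper's formula).
        """
        gg = g @ g
        gs = g @ s
        gy = g @ y          # approximates g^T B s
        sy = s @ y          # approximates s^T B s
        rho = (y @ y) / sy  # assumed scalar estimate so that g^T B g ≈ rho * gg
        # First-order optimality of q(mu, nu) gives the 2x2 linear system:
        # [ g^T B g   g^T B s ] [mu]   [ -g^T g ]
        # [ s^T B g   s^T B s ] [nu] = [ -s^T g ]
        A = np.array([[rho * gg, gy],
                      [gy,       sy]])
        b = -np.array([gg, gs])
        mu, nu = np.linalg.solve(A, b)
        return mu * g + nu * s

Note that the 2x2 system is well defined only when the approximate Hessian is positive definite on the subspace (in particular s_k^T y_k > 0); SMCG-type methods typically fall back to a standard conjugate gradient or steepest descent direction when this fails.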

Direction Choice Model
Selection of Initial Step Size
The Obtained Algorithm
Descent Properties of Search Direction
Convergence Analysis
Numerical Results
Conclusions and Prospect