Abstract
This paper proposes a new nonmonotone adaptive trust region line search method for solving unconstrained optimization problems and presents a modified trust region ratio that achieves better consistency between the accurate model and the approximate model. The approximation of the Hessian matrix is updated by a modified BFGS formula, and the trust region radius follows a new adaptive strategy that avoids additional computational cost at each iteration. Global convergence and superlinear convergence of the method are established under suitable conditions. Finally, numerical results show that the proposed method is efficient.
Highlights
Consider the following unconstrained optimization problem

    \min_{x \in \mathbb{R}^n} f(x),    (1)

where f : \mathbb{R}^n \to \mathbb{R} is a twice continuously differentiable function.

The basic idea of trust region methods is as follows: at the current iterate x_k, the trial step d_k is obtained by solving the subproblem

    \min_{d \in \mathbb{R}^n} m_k(d) = f_k + g_k^T d + \frac{1}{2} d^T B_k d,   s.t.   \|d\| \le \Delta_k,

where f_k = f(x_k), g_k = \nabla f(x_k), G_k = \nabla^2 f(x_k), B_k is a symmetric approximation of G_k, \Delta_k is the trust region radius, and \|\cdot\| is the Euclidean norm.
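The subproblem above is usually solved only approximately. As a minimal sketch (not the paper's actual subproblem solver, which is not specified here), the Cauchy point minimizes the quadratic model along the steepest-descent direction, clipped to the trust region:

```python
import numpy as np

def cauchy_point(g, B, delta):
    """Approximate minimizer of m(d) = g@d + 0.5*d@B@d subject to
    ||d|| <= delta, taken along -g (the Cauchy point)."""
    gnorm = np.linalg.norm(g)
    gBg = g @ B @ g
    if gBg <= 0:
        tau = 1.0  # model is nonconvex along -g: step to the boundary
    else:
        tau = min(1.0, gnorm**3 / (delta * gBg))
    return -tau * (delta / gnorm) * g

# Example: g = (1, 0), B = I, delta = 0.5 gives the step (-0.5, 0)
g = np.array([1.0, 0.0])
B = np.eye(2)
d = cauchy_point(g, B, 0.5)
```

The Cauchy point already guarantees a sufficient-decrease bound on the model, which is the standard building block in trust region convergence proofs.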
Summary
Consider the unconstrained optimization problem \min_{x \in \mathbb{R}^n} f(x), where f : \mathbb{R}^n \to \mathbb{R} is a twice continuously differentiable function. The trust region method is one of the prominent classes of iterative methods: at each iteration, a trial step d_k is obtained by solving the trust region subproblem stated above.

Proof. According to Step 4 of Algorithm 2.1, the trust region radius satisfies

    \Delta_k \ge c^{p_k} \frac{\|g_k\|}{\|B_k\|} \ge \beta_1^{p_k} \frac{\|g_k\|}{\|B_k\|} \ge \beta_1^{p_k} \frac{\|g_k\|}{M_1}.

Lemma 3.8. Suppose that Assumption 2.1 holds and the sequence {x_k} is generated by Algorithm 2.1; then \lim_{k \to \infty} \|g_k\| = 0.

Proof. From Lemma 3.3, we know that Algorithm 2.1 generates an infinite sequence {x_k} satisfying \rho_k \ge \mu_1, so

    \frac{f_{l(k)} - f(x_k + d_k)}{f_{l(k)} - f_k - m_k(d_k)} \ge \mu_1.

On the basis of the above lemmas and analysis, the global convergence of Algorithm 2.1 follows.

Theorem 3.1 (Global convergence). Suppose that Assumption 2.1 holds and the sequence {x_k} is generated by Algorithm 2.1.
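The two quantities that drive the summary above, the adaptive radius and the nonmonotone ratio, can be sketched as follows. Both forms are assumptions inferred from the bounds quoted in the summary (the rule \Delta_k = c^{p_k}\|g_k\|/\|B_k\| and the ratio \rho_k with the nonmonotone reference value f_{l(k)}); the paper's exact definitions may differ.

```python
import numpy as np

def adaptive_radius(g, B, c, p):
    # Hypothetical adaptive rule Delta_k = c**p_k * ||g_k|| / ||B_k||,
    # matching in shape the lower bound from Step 4 of Algorithm 2.1.
    # ||B|| is the spectral (2-)norm, consistent with the Euclidean vector norm.
    return c**p * np.linalg.norm(g) / np.linalg.norm(B, 2)

def nonmonotone_ratio(f_history, f_new, f_k, model_value):
    # Nonmonotone ratio rho_k = (f_l(k) - f(x_k + d_k)) /
    #                           (f_l(k) - f_k - m_k(d_k)),
    # where f_l(k) is the maximum over recent function values and
    # model_value = m_k(d_k) = g_k@d_k + 0.5*d_k@B_k@d_k (reduction part).
    f_l = max(f_history)
    return (f_l - f_new) / (f_l - f_k - model_value)
```

Using the maximum of recent function values f_{l(k)} instead of f_k alone is what makes the strategy nonmonotone: a trial step can be accepted even if it temporarily increases f, which often helps on ill-conditioned problems.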