Abstract

Variants of Newton's method are widely used for solving unconstrained optimization problems, and considerable progress has been made on the global convergence of the BFGS method. The q-gradient reduces to the classical gradient as q approaches 1. In this paper, we propose a quantum Broyden–Fletcher–Goldfarb–Shanno (q-BFGS) algorithm in which the Hessian approximation is constructed from the q-gradient and a descent direction is computed at each iteration. The algorithm applies the independent parameter q in the Armijo–Wolfe conditions to compute a step length that guarantees a decrease in the objective function value. Global convergence is established without a convexity assumption on the objective function. Finally, the proposed method is verified on numerical test problems and the results are illustrated through performance profiles.
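The abstract names the three ingredients of the method: a q-gradient in place of the classical gradient, a BFGS-style inverse-Hessian approximation that yields a descent direction, and an Armijo–Wolfe line search parameterized by q. For a nonzero coordinate x_i, the Jackson-type q-partial derivative is (f(x) − f(x_1, …, q·x_i, …, x_n)) / ((1 − q)·x_i), which tends to the classical partial derivative as q → 1. The Python sketch below only illustrates these ideas under that standard definition; the helper names (q_gradient, wolfe_step, q_bfgs), the backtracking rule, the zero-coordinate fallback, and the curvature safeguard are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def q_gradient(f, x, q=0.9, h=1e-8):
    """Jackson-type first-order q-partial derivatives; falls back to a
    forward difference when a coordinate is (near) zero.  As q -> 1 each
    component approaches the classical partial derivative."""
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for i in range(x.size):
        xi = x[i]
        if abs(xi) > 1e-12 and q != 1.0:
            xq = x.copy()
            xq[i] = q * xi
            g[i] = (f(x) - f(xq)) / ((1.0 - q) * xi)
        else:  # classical limit / degenerate coordinate
            e = np.zeros_like(x)
            e[i] = h
            g[i] = (f(x + e) - f(x)) / h
    return g

def wolfe_step(f, x, d, g, c1=1e-4, c2=0.9, q=0.9, max_iter=30):
    """Backtracking search for a step length satisfying Armijo-Wolfe-type
    conditions evaluated with the q-gradient (illustrative, not the paper's
    exact rule)."""
    alpha = 1.0
    f0, slope = f(x), g @ d
    for _ in range(max_iter):
        x_new = x + alpha * d
        if f(x_new) <= f0 + c1 * alpha * slope and \
           q_gradient(f, x_new, q) @ d >= c2 * slope:
            break
        alpha *= 0.5
    return alpha

def q_bfgs(f, x0, q=0.9, tol=1e-6, max_iter=200):
    """Minimal q-BFGS-style loop: descent direction from an inverse-Hessian
    approximation built from q-gradient differences."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                      # inverse Hessian approximation
    g = q_gradient(f, x, q)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                     # descent direction
        alpha = wolfe_step(f, x, d, g, q=q)
        s = alpha * d
        x_new = x + s
        g_new = q_gradient(f, x_new, q)
        y = g_new - g
        ys = y @ s
        if ys > 1e-10:                 # curvature safeguard for non-convex f
            rho = 1.0 / ys
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function (q close to 1 behaves like BFGS)
rosen = lambda z: (1 - z[0])**2 + 100 * (z[1] - z[0]**2)**2
print(q_bfgs(rosen, [-1.2, 1.0], q=0.999))
```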

Highlights

  • Several numerical methods have been developed for solving unconstrained optimization problems

  • The global convergence of the BFGS method has been studied by several authors [5, 12, 19, 20, 21] under a convexity assumption on the objective function

  • Li et al. [23] addressed the open problem of whether the BFGS method with an inexact line search converges globally when applied to non-convex unconstrained optimization problems



Introduction

Several numerical methods have been developed for solving unconstrained optimization problems. The global convergence of the BFGS method has been studied by several authors [5, 12, 19, 20, 21] under a convexity assumption on the objective function. A modified BFGS method was developed that converges globally without a convexity assumption on the objective function [23].
