Abstract

Global optimization remains challenging because of the difficulty of locating the global optimum of a multimodal function. Two major challenges are how to move from the current minimizer to a better one and how to decide whether the minimizer obtained is the desired global one. The filled function method is one of the recently studied deterministic, easily applied methods that addresses these problems. Its basic idea is to first minimize the objective function (first phase) and then construct an auxiliary function whose minimization (second phase) locates a point with a lower objective value than the current minimizer. In the second phase, any local minimization method can be applied. Newton's method is fast at finding a zero of the gradient of a quadratic function, but computing the Hessian matrix may be very expensive or infeasible for complex problems. The Jameson gradient-based method is a search procedure that avoids storing an estimate of the Hessian or its inverse and does not require exact line searches. In this paper, an algorithm combining a parameter-free filled function method with the Jameson gradient method is introduced for solving global optimization problems in two variables. The algorithm is applied to several benchmark test functions, and its numerical performance on two-dimensional global optimization problems is reported.
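To make the two-phase structure concrete, the following is a minimal sketch in Python. It is not the paper's method: the auxiliary function used here is a classical parameter-dependent filled function (not the paper's parameter-free construction), SciPy's BFGS stands in for the Jameson gradient-based local search, and the two-variable Rastrigin function is used only as an illustrative multimodal benchmark.

import numpy as np
from scipy.optimize import minimize

def f(x):
    # Two-variable multimodal benchmark (Rastrigin), chosen for illustration.
    return 20 + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def filled_function_search(x0, max_cycles=20, rho=1.0, r=1.0):
    # Phase 1: local minimization of f from the starting point.
    x_star = minimize(f, x0, method="BFGS").x
    for _ in range(max_cycles):
        def p(x):
            # Illustrative classical filled function: it peaks at the current
            # minimizer x_star, so descending it pushes the search out of the
            # current basin toward points with lower objective value.
            return np.exp(-np.dot(x - x_star, x - x_star) / rho**2) / (r + f(x))
        # Phase 2: minimize the auxiliary function from a point near x_star,
        # then re-minimize f from wherever that search lands.
        y0 = x_star + 0.1 * np.random.randn(x_star.size)
        y = minimize(p, y0, method="BFGS").x
        x_new = minimize(f, y, method="BFGS").x
        if f(x_new) < f(x_star) - 1e-8:
            x_star = x_new   # a lower basin was found; repeat both phases
        else:
            break            # no improvement: accept x_star as the answer
    return x_star, f(x_star)

# Example call from an arbitrary two-dimensional starting point.
best_x, best_f = filled_function_search(np.array([3.0, -2.5]))

In the paper's algorithm the second-phase descent would instead use the Jameson gradient-based iteration, which requires neither the Hessian nor exact line searches.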
