Abstract

The purpose of this paper is to compare the performance of two global optimization methods of very different natures. Genetic Algorithms (GA) are stochastic global search procedures that require only objective function values. The Tunneling Algorithms (TA), classical and exponential, are deterministic, need gradient information (and hence a smooth objective, f ∈ C²), and depend on a single initial point. The results obtained here show that although GA descends very quickly to lower levels of the objective function, it is extremely slow to finally reach the desired global solution, especially in higher dimensions. TA converge to the global solution from every initial point in the sample, given enough time, even for problems with a large number of local minima near the global level and in higher dimensions. Scaling the problem proved extremely beneficial for TA, especially in higher dimensions, since it reduces the complexity of the objective function.
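To make the two phases of a tunneling method concrete, here is a minimal one-dimensional sketch. It assumes the classical tunneling transform in the Levy–Montalvo form, T(x) = (f(x) − f*) / ‖x − x*‖^(2λ); the objective f, the step size, and the parameter λ are illustrative choices, not taken from the paper's experiments.

```python
import numpy as np

# Illustrative multimodal objective (not one of the paper's test problems).
def f(x):
    return x**2 + 10.0 * np.sin(x)**2

def grad_f(x):
    return 2.0 * x + 10.0 * np.sin(2.0 * x)

# Phase 1: local descent from an initial point to a minimizer x*
# with level f* = f(x*). Plain gradient descent stands in for any
# local minimizer here.
x_star = 3.0
for _ in range(500):
    x_star -= 0.01 * grad_f(x_star)
f_star = f(x_star)

# Phase 2: classical tunneling transform. Minima whose value exceeds
# the current level f* remain positive under T; a sign change (T < 0)
# reveals a point lying in a basin below f*, from which local descent
# can be restarted.
def classical_tunneling(x, x_star, f_star, lam=1.0):
    return (f(x) - f_star) / abs(x - x_star) ** (2.0 * lam)

print(classical_tunneling(0.0, x_star, f_star))  # negative: lower basin found
print(classical_tunneling(4.0, x_star, f_star))  # positive: above current level
```

The sign of T, rather than its magnitude, is what drives the search: the tunneling phase only needs to locate any point where T < 0, after which the (gradient-based, deterministic) minimization phase resumes, which is why TA require f ∈ C² while GA do not.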
