Abstract

Agent-based algorithms, inspired by the collective behavior of natural social groups, exploit innate swarm intelligence to produce metaheuristic methodologies for exploring optimal solutions to diverse processes in systems engineering and other sciences. Especially for complex problems, long processing times and the risk of converging to a local optimum are drawbacks of these algorithms, and to date none has proved its superiority. In this paper, an improved swarm optimization technique named the Grand Tour Algorithm (GTA), based on the behavior of a peloton of cyclists and embodying relevant physical concepts, is introduced and applied to fourteen benchmark optimization problems to evaluate its performance against four other popular classical metaheuristic optimization algorithms. For comparison purposes, these problems are tackled initially with 1000 variables, and then with up to 20,000 variables, an exceptionally large number inspired by the human genome. The results show that GTA clearly outperforms the other algorithms. To strengthen GTA's value, various sensitivity analyses are performed to verify the minimal influence of the initial parameters on its efficiency. It is demonstrated that GTA fulfils the fundamental requirements of an optimization algorithm: ease of implementation, speed of convergence, and reliability. Since optimization permeates modeling and simulation, we finally propose that GTA will be appealing to the agent-based community and of great help for a wide variety of agent-based applications.
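
To give a sense of the problem scale reported above, the short sketch below evaluates two standard benchmark objectives (Sphere and Rastrigin) at 1,000 and 20,000 variables. These two functions are used here purely as illustrative examples; they are not necessarily among the fourteen benchmark problems used in the paper.

```python
import numpy as np

def sphere(x):
    # Unimodal benchmark; global minimum 0 at x = 0.
    return np.sum(x ** 2)

def rastrigin(x):
    # Highly multimodal benchmark; global minimum 0 at x = 0.
    return 10 * x.size + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))

# Evaluate a random candidate at the two problem sizes discussed in the paper.
for dim in (1_000, 20_000):
    x = np.random.uniform(-5.12, 5.12, dim)
    print(dim, sphere(x), rastrigin(x))
```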

Highlights

  • Many optimization problems in engineering are of a very complex nature and must be solved in accordance with various, sometimes complicated constraints

  • In particular, the metaheuristic we develop in this paper may offer a number of benefits for many agent-based models (ABMs) and applications

  • This score is a parameter considered for the CEC 2017 competition in [44] to compare different optimization algorithms, and is expressed by Equations (12) and (13)



Introduction

Many optimization problems in engineering are of a very complex nature and must be solved in accordance with various, sometimes complicated, constraints. As a consequence, finding an optimal solution is often hard. A large number of mixed variables and differential and nonlinear equations are typically used to describe such problems, and in many cases classical optimization procedures based on differential methods cannot be applied. Metaheuristic techniques arise to bridge this gap, as they can explore the search space for optimal and feasible solutions in a less restrictive (derivative-free) framework. Because the set of feasible solutions is infinite, metaheuristic algorithms use empirical iterative search methods, based on various heuristics, to guide the search so that the solution is expected to improve at every step. A minimal sketch of this idea is given below.
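
The following sketch illustrates the generic derivative-free, population-based search loop described above: candidate solutions are perturbed around the best solution found so far and are kept only when they improve the objective. It is a generic illustration under assumed parameters, not the Grand Tour Algorithm itself; the function name `population_search` and its defaults are hypothetical.

```python
import numpy as np

def population_search(objective, dim, pop_size=50, iters=200,
                      lower=-5.0, upper=5.0, step=0.1, seed=0):
    """Generic derivative-free, population-based metaheuristic sketch."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(lower, upper, (pop_size, dim))      # random initial population
    fitness = np.apply_along_axis(objective, 1, pop)
    best = pop[np.argmin(fitness)].copy()
    best_f = fitness.min()
    for _ in range(iters):
        # Heuristic move: draw new candidates around the current best solution.
        candidates = best + step * (upper - lower) * rng.standard_normal((pop_size, dim))
        candidates = np.clip(candidates, lower, upper)
        cand_f = np.apply_along_axis(objective, 1, candidates)
        # Greedy acceptance: keep a candidate only if it improves on its predecessor.
        improved = cand_f < fitness
        pop[improved], fitness[improved] = candidates[improved], cand_f[improved]
        if fitness.min() < best_f:
            best_f = fitness.min()
            best = pop[np.argmin(fitness)].copy()
    return best, best_f

# Example: minimize the sphere function in 1,000 dimensions.
best, best_f = population_search(lambda x: np.sum(x ** 2), dim=1_000)
print(best_f)
```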
