Abstract

Metaheuristic optimization algorithms are widely used in the literature on discrete optimization. They typically (1) rely on stochastic operators, making each run unique, and (2) are governed by algorithmic control parameters whose impact on convergence is hard to predict. Although both (1) and (2) affect algorithm performance, the effect of the control parameters is largely disregarded in the literature on structural optimization, making it difficult to draw general conclusions. This article presents a new method for assessing the performance of a metaheuristic algorithm in relation to its control parameter values. A Monte Carlo simulation is conducted in which many independent runs of the algorithm are performed with random control parameter values. In each run, a measure of performance is recorded. The resulting dataset is then restricted to the runs that performed best. The frequency with which each parameter value occurs in this subset reveals which values are responsible for good performance. Importance sampling techniques are used to ensure that inferences drawn from the simulation are sufficiently accurate. The new performance assessment method is demonstrated for the genetic algorithm in MATLAB R2018b, applied to seven common structural optimization test problems, where it successfully detects parameters that are unimportant (for the problems at hand) while identifying well-performing values for the important parameters. For two of the test problems, a better solution is found than the best reported so far in the literature.
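To make the procedure concrete, the following is a minimal Python sketch of the assessment method described above. The parameter names, the candidate values, and the toy run_algorithm response are illustrative stand-ins, not the paper's MATLAB R2018b genetic-algorithm setup or its structural test problems.

```python
import random
from collections import Counter

# Hypothetical control parameters and candidate values; the paper tunes
# MATLAB R2018b's genetic algorithm instead, on structural test problems.
PARAM_LEVELS = {
    "population_size": [25, 50, 100, 200],
    "crossover_fraction": [0.6, 0.7, 0.8, 0.9],
    "elite_count": [1, 2, 5],
}

def run_algorithm(params, seed):
    """Stand-in for one independent metaheuristic run.

    Returns a performance measure obtained within a fixed computational
    budget (lower is better). The toy response below pretends that
    crossover_fraction matters and elite_count does not.
    """
    rng = random.Random(seed)
    return abs(params["crossover_fraction"] - 0.8) + rng.gauss(0.0, 0.05)

def assess(n_runs=1000, keep_fraction=0.1):
    # Monte Carlo simulation: many independent runs with random parameter values.
    runs = []
    for seed in range(n_runs):
        params = {name: random.choice(values) for name, values in PARAM_LEVELS.items()}
        runs.append((run_algorithm(params, seed), params))
    # Restrict the dataset to the best-performing runs.
    runs.sort(key=lambda run: run[0])
    best = [params for _, params in runs[: int(keep_fraction * n_runs)]]
    # The frequency of each value in this subset indicates which values
    # are responsible for good performance.
    for name in PARAM_LEVELS:
        print(name, Counter(params[name] for params in best))

if __name__ == "__main__":
    assess()
```

In this toy setup the frequency table for crossover_fraction concentrates on 0.8 among the best runs, while the counts for elite_count stay roughly uniform, which is how an unimportant parameter reveals itself.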

Highlights

  • Metaheuristic algorithms are widespread in the literature on discrete and combinatorial optimization

  • A method has been developed to assess the performance of a metaheuristic algorithm in relation to the values of the control parameters

  • It is based on Monte Carlo sampling of independent algorithm runs and uses importance sampling as a variance reduction technique (a minimal sketch follows below)
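The paper's estimator itself is not reproduced on this page; as a hedged illustration of importance sampling used for variance reduction, the sketch below implements a generic self-normalized (ratio) estimator in Python. The target density p, the proposal q, and the integrand f are assumptions chosen for the example, not the paper's choices.

```python
import math
import random

def importance_sampling_mean(f, log_p, log_q, sample_q, n=10_000):
    """Self-normalized importance-sampling estimate of E_p[f(X)].

    Samples are drawn from the proposal q instead of the target p and
    reweighted by w = p(x)/q(x); the ratio form cancels any unknown
    normalizing constants of p and q.
    """
    xs = [sample_q() for _ in range(n)]
    ws = [math.exp(log_p(x) - log_q(x)) for x in xs]
    return sum(w * f(x) for w, x in zip(ws, xs)) / sum(ws)

if __name__ == "__main__":
    # Assumed example: estimate E[X^2] under N(0, 1) while sampling from
    # the wider proposal N(0, 2); the true value is 1.0.
    log_p = lambda x: -0.5 * x * x            # N(0, 1), up to a constant
    log_q = lambda x: -0.5 * (x / 2.0) ** 2   # N(0, 2), up to a constant
    sample_q = lambda: random.gauss(0.0, 2.0)
    print(importance_sampling_mean(lambda x: x * x, log_p, log_q, sample_q))
```

Because the estimator is a ratio of weighted sums, any unknown normalizing constants of p and q cancel, which is what makes the self-normalized form convenient in practice.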

Summary

INTRODUCTION

Metaheuristic algorithms are widespread in the literature on discrete and combinatorial optimization. They are typically controlled by a number of algorithmic parameters that can be modified to tune the search procedure to the optimization problem at hand; these are referred to as control parameters in this article. Data on optimal control parameter values across multiple studies might reveal search strategies that work well for a specific kind of optimization problem. In this regard, Hooker (1995) made the distinction between competitive and scientific testing, which, in the context of metaheuristic algorithms, is still very relevant today. This article presents a new method designed to assess the performance of a metaheuristic algorithm in relation to the values of its control parameters.

Early Work
Meta-Optimization
Model-free Algorithm Configuration
Parameter Tuning in Structural Optimization
List of Symbols
General Approach
Problem Definition
Monte Carlo Estimator
Importance Sampling
Ratio Estimator
Choosing an Appropriate Proposal Distribution
Updating the Proposal Distribution
The variance of the resulting estimator is approximated by a delta-method expression (see the formula sketched below).
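The equation itself is not shown on this page. For reference, a standard delta-method approximation for the variance of a self-normalized (ratio) importance-sampling estimator takes the form below; this is a textbook expression offered as an assumption about the intended formula, not the paper's verbatim equation.

```latex
% Self-normalized (ratio) importance-sampling estimator and its
% delta-method variance; standard textbook form, assumed here since the
% paper's exact notation is not available on this page.
\[
  \hat{\mu} = \frac{\sum_{i=1}^{n} w_i \, f(x_i)}{\sum_{i=1}^{n} w_i},
  \qquad w_i = \frac{p(x_i)}{q(x_i)}, \quad x_i \sim q,
\]
\[
  \widehat{\operatorname{Var}}(\hat{\mu}) \approx
  \frac{\sum_{i=1}^{n} w_i^{2} \bigl( f(x_i) - \hat{\mu} \bigr)^{2}}
       {\bigl( \sum_{i=1}^{n} w_i \bigr)^{2}} .
\]
```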
NUMERICAL EXPERIMENTS AND DISCUSSION
Impact of the Computational Budget
Comparison Between Multiple Test Problems
CONCLUSIONS
Findings
DATA AVAILABILITY STATEMENT
