Abstract

Metaheuristic algorithms, which are developed to find near-optimal solutions to optimization problems within acceptable times, have specific parameters that significantly affect their performance, and fine-tuning these parameters can yield an effective, well-performing version of such algorithms. We propose a novel algorithm configuration method based on Latin Hypercube Hammersley Sampling (LHHS) and Fuzzy C-means Clustering (FCM); these methods are used here for the first time in the algorithm configuration literature. We evaluated the proposed tuning method in two experiments covering four cases and compared its performance with current state-of-the-art automatic parameter tuning methods. In the first experiment, we tune three numerical parameters of the Standard Genetic Algorithm (SGA), and in the second, we tune two numerical parameters of the Artificial Bee Colony (ABC) algorithm. Our experimental results show that the proposed tuning method outperformed the other state-of-the-art tuning methods in two cases and was competitive with them in the remaining two. The most important outcome of our experiments is that not only the best configuration but also the other configurations remaining in the final configuration set achieved results competitive with the best configurations found by other state-of-the-art parameter tuning methods. However, because the proposed method is designed to find the best-performing configurations among a large initial configuration set using all of the available tuning budget, it requires more computational time.
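
The sketch below (Python) illustrates the two building blocks named above under assumptions not taken from the paper: a plain Hammersley point set stands in for LHHS (which combines Latin hypercube and Hammersley sampling), and a textbook Fuzzy C-means routine groups the sampled configurations. The SGA parameter names, ranges, sample size, and cluster count are hypothetical placeholders, not the paper's settings, and this is not the authors' implementation.

import numpy as np

def hammersley(n_points, dim):
    # Hammersley point set in [0, 1)^dim: the first coordinate is i/n,
    # the remaining coordinates are radical inverses in successive prime bases.
    primes = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29][:dim - 1]
    pts = np.zeros((n_points, dim))
    pts[:, 0] = np.arange(n_points) / n_points
    for j, base in enumerate(primes, start=1):
        for i in range(n_points):
            f, inv, k = 1.0, 0.0, i
            while k > 0:
                f /= base
                inv += f * (k % base)
                k //= base
            pts[i, j] = inv
    return pts

def fuzzy_cmeans(x, n_clusters, m=2.0, max_iter=200, tol=1e-6, seed=0):
    # Textbook FCM: returns cluster centers and the fuzzy membership matrix
    # u of shape (n_points, n_clusters), each row summing to 1.
    rng = np.random.default_rng(seed)
    u = rng.random((x.shape[0], n_clusters))
    u /= u.sum(axis=1, keepdims=True)
    for _ in range(max_iter):
        um = u ** m
        centers = um.T @ x / um.sum(axis=0)[:, None]
        dist = np.linalg.norm(x[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        new_u = 1.0 / ((dist[:, :, None] / dist[:, None, :]) ** (2.0 / (m - 1.0))).sum(axis=2)
        if np.linalg.norm(new_u - u) < tol:
            u = new_u
            break
        u = new_u
    return centers, u

# Illustrative use: 128 candidate (crossover rate, mutation rate) configurations
# for an SGA, scaled from the unit square to assumed ranges, then grouped into
# 8 fuzzy clusters whose representatives could each receive part of the tuning budget.
unit = hammersley(128, 2)
lo, hi = np.array([0.5, 0.001]), np.array([1.0, 0.1])
configs = lo + unit * (hi - lo)
centers, memberships = fuzzy_cmeans(configs, n_clusters=8)
print(centers)  # one representative configuration per cluster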
