Abstract

Multi-Objective Evolutionary Algorithms (MOEAs) have been applied successfully to real-world multi-objective problems. Their success can depend highly on the configuration of their control parameters. Various tuning methods have been proposed to address this problem. Tuning can be performed on a set of problem instances in order to obtain robust control parameters. However, for real-world problems, the set of problem instances at our disposal is usually limited. This raises the question: How many problems must be used in the tuning process to obtain sufficiently robust parameters? To answer this question, a novel method called MOCRS-Tuning was applied to problem sets of different sizes for the real-world integration and test order problem. The configurations obtained by the tuning process were compared on all the problem instances used. The results show that tuning greatly improves the algorithms’ performance and that a larger subset used for tuning does not guarantee better results. This indicates that robust control parameters can be obtained with a small subset of problem instances, which also substantially reduces the time required for tuning.

Highlights

  • Setting control parameters of Evolutionary Algorithms (EAs) remains one of the biggest challenges in evolutionary computation

  • We investigated how the size of the problem set impacts the tuning of Multi-Objective Evolutionary Algorithms (MOEAs)

  • The tuning was performed with MOCRS-Tuning, which uses jDE and a chess rating system with a quality-indicator ensemble to find the best configuration for a given MOEA

Introduction

Setting control parameters of Evolutionary Algorithms (EAs) remains one of the biggest challenges in evolutionary computation. Control parameters have a great impact on the performance of EAs; a poor setting can even make it impossible to solve the problem at hand [1]. To avoid tuning on individual problem instances, tuning can be carried out on only a subset of instances. This provides us with robust control parameters, making the tuned algorithm a generalist [4]. MOCRS-Tuning uses a meta-evolutionary approach to find the best configuration (control parameters) for the given MOEA. The EARS framework was used to compare the configurations obtained by tuning an MOEA on differently sized problem sets.
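The sketch below illustrates the meta-evolutionary idea in simplified form: an outer jDE loop searches over MOEA control parameters, and each candidate configuration is scored on a small subset of training instances. It is only a minimal sketch, not the MOCRS-Tuning implementation: the function run_moea(), the two-parameter configuration, and the averaging of scores (a stand-in for the chess rating with the quality-indicator ensemble) are hypothetical placeholders introduced for illustration.

```python
# Minimal, hypothetical sketch of meta-evolutionary tuning on a subset of instances.
# run_moea() is a placeholder for running the actual MOEA (e.g., via EARS) and
# returning a quality-indicator value; here it is faked so the sketch runs.

import random

def run_moea(config, instance):
    # Placeholder: in a real setup this would run the MOEA with `config` on
    # `instance` and return e.g. a hypervolume value (higher = better).
    cx_rate, mut_rate = config
    return -(cx_rate - 0.9) ** 2 - (mut_rate - 0.1) ** 2 + random.gauss(0, 0.01)

def score(config, instances):
    # Average quality over the tuning subset; a simplified stand-in for the
    # pairwise chess-rating comparison used by MOCRS-Tuning.
    return sum(run_moea(config, inst) for inst in instances) / len(instances)

def jde_tune(instances, pop_size=10, generations=30,
             bounds=((0.5, 1.0), (0.0, 0.5))):
    # jDE: each individual carries its own F and CR, re-sampled with probability 0.1.
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    F = [0.5] * pop_size
    CR = [0.9] * pop_size
    fit = [score(ind, instances) for ind in pop]
    for _ in range(generations):
        for i in range(pop_size):
            Fi = random.uniform(0.1, 1.0) if random.random() < 0.1 else F[i]
            CRi = random.random() if random.random() < 0.1 else CR[i]
            a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
            jrand = random.randrange(len(bounds))
            trial = []
            for j, (lo, hi) in enumerate(bounds):
                if random.random() < CRi or j == jrand:
                    v = pop[a][j] + Fi * (pop[b][j] - pop[c][j])
                else:
                    v = pop[i][j]
                trial.append(min(max(v, lo), hi))  # keep parameters within bounds
            t_fit = score(trial, instances)
            if t_fit >= fit[i]:
                pop[i], fit[i], F[i], CR[i] = trial, t_fit, Fi, CRi
    best = max(range(pop_size), key=lambda i: fit[i])
    return pop[best], fit[best]

if __name__ == "__main__":
    training_subset = ["instance_%d" % k for k in range(3)]  # small tuning subset
    best_config, best_score = jde_tune(training_subset)
    print("best configuration:", best_config, "score:", best_score)
```

The key design point echoed by the paper's question is the choice of `training_subset`: the tuned configuration is evaluated afterwards on all available instances, so a small subset keeps tuning cheap while (as the results suggest) still yielding robust parameters.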

Related Work
Integration and Testing Order
Chess Rating System for Evolutionary Algorithms
MOCRS-Tuning Method
Experiment
Results and Discussion
Conclusions