Abstract

Dynamic multi-objective optimization problems (DMOPs) model real-world problems whose characteristics change over time. Algorithms for solving DMOPs are evaluated and improved by comparing their performance on benchmarks. However, some existing DMOP benchmarks suffer from non-uniform weights across decision variables. Additionally, dynamic many-objective optimization problems (DMaOPs) involve more than three objectives, yet only a few existing benchmarks can be extended to accommodate them. Furthermore, some existing performance measures for DMOPs may not effectively compare the relative performance of multiple algorithms or evaluate the uniformity of the search across objectives. In this paper, we improve an existing DMOP benchmark by expanding the impact range of the decision variables. Moreover, we propose a benchmark framework that can be extended to DMaOPs, addressing a research gap between the optimization of DMOPs and that of DMaOPs. Additionally, a set of performance measures for DMOPs is proposed that evaluates both the relative performance and the search uniformity of multi-objective optimization algorithms. By comparing state-of-the-art and commonly used algorithms on the test problems, we gain a better understanding of the characteristics, strengths, and weaknesses of both the algorithms and the test problems.
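For context, a minimal sketch of the standard DMOP formulation from the dynamic optimization literature is given below; it is generic background rather than a formulation taken from this paper, and the symbols (x: decision vector, Omega: feasible region, t: a discrete time or environment index, m: number of objectives) are conventional notation, not names used by the authors.

\min_{\mathbf{x} \in \Omega} \mathbf{F}(\mathbf{x}, t) = \bigl( f_1(\mathbf{x}, t), f_2(\mathbf{x}, t), \ldots, f_m(\mathbf{x}, t) \bigr)

Under this formulation, the Pareto-optimal set and front may shift as t advances; the case m > 3 corresponds to the DMaOPs discussed above.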
