Abstract

The performance of most multi-objective evolutionary algorithms (MOEAs) usually degenerates when they are adopted to tackle large-scale multi-objective optimization problems (LSMOPs), i.e., problems involving a large number of decision variables. As LSMOPs have attracted increasing attention in the evolutionary computation community in recent years, a number of sophisticated approaches have been proposed to improve the performance of MOEAs on LSMOPs. However, while many real-world LSMOPs have sparse Pareto optimal solutions (i.e., most decision variables of the optimal solutions are zero) that should be found within a limited budget of function evaluations, few MOEAs have been tailored for such sparse LSMOPs, and the performance of existing MOEAs on them has not been well studied. In this paper, we first briefly review existing MOEAs for LSMOPs and then elaborate on two customized MOEAs for solving sparse LSMOPs. Next, we select six state-of-the-art MOEAs with different search strategies to conduct an experimental study on eight sparse benchmark problems and four real-world applications. Finally, we outline some future research directions for large-scale optimization involving sparsity.

Keywords: Large-scale multi-objective optimization, Sparse Pareto optimal solutions, Evolutionary algorithm, Real-world applications
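To make the notion of a sparse LSMOP concrete, the following is a minimal, purely illustrative sketch (not a problem or algorithm from the paper): a toy bi-objective problem with a large number of decision variables, where the second objective penalizes every nonzero variable, so good trade-off solutions keep most variables exactly at zero. The function name and formulation are hypothetical.

```python
import numpy as np

def toy_sparse_biobjective(x):
    """Hypothetical bi-objective problem with sparse optima.

    f1 depends only on x[0]; f2 additionally penalizes the magnitude of
    every remaining variable, so Pareto optimal solutions set x[1:] to zero.
    """
    f1 = x[0] ** 2
    f2 = (x[0] - 1.0) ** 2 + np.sum(np.abs(x[1:]))
    return f1, f2

d = 1000                 # a "large-scale" number of decision variables
x = np.zeros(d)          # start from the all-zero (maximally sparse) vector
x[0] = 0.5               # a single nonzero variable: a sparse candidate
f1, f2 = toy_sparse_biobjective(x)
sparsity = np.count_nonzero(x) / d   # fraction of nonzero variables
```

Here `f1 == 0.25` and `f2 == 0.25`, and the candidate's sparsity is `0.001`; an MOEA unaware of this structure would typically spend many function evaluations perturbing the 999 variables whose optimal value is zero.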
