Abstract
Sparse large-scale multiobjective optimization problems (sparse LSMOPs) contain numerous decision variables, and the decision variables of their Pareto optimal solutions are very sparse (i.e., the majority of these decision variables are zero-valued). This poses great challenges for an algorithm to converge to the Pareto set. Numerous evolutionary algorithms (EAs) tailored for sparse LSMOPs have been proposed in recent years. However, the final populations generated by these EAs are not sparse enough, because the locations of the nonzero decision variables are difficult to identify accurately and there is insufficient interaction between the process of locating the nonzero decision variables and the process of optimizing them. To address this issue, we propose a dynamic sparse grouping evolutionary algorithm (DSGEA) that dynamically groups the decision variables of solutions in the population that have comparable numbers of nonzero decision variables. Improved evolutionary operators are introduced to optimize the decision variables in groups. As a result, the population obtained by DSGEA can stably evolve towards sparser Pareto optimal solutions with precisely located nonzero decision variables. The proposed algorithm outperforms existing state-of-the-art EAs for sparse LSMOPs in experiments on three real-world problems and eight benchmark problems.
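To illustrate the grouping idea described above, the following is a minimal, hypothetical sketch of one way to partition decision variables by how often they are nonzero across the population. It is not the paper's exact procedure; the function name, the number of groups, and the equal-size split are assumptions introduced here purely for illustration.

```python
import numpy as np

def group_by_nonzero_frequency(population, n_groups=4):
    """Hypothetical sketch: partition decision variables into groups whose
    members are nonzero in a comparable number of population members.

    population : (N, D) array of real-valued decision vectors.
    Returns a list of index arrays, one per group.
    """
    # For each decision variable, count how many solutions set it to a nonzero value.
    nonzero_counts = np.count_nonzero(population, axis=0)
    # Rank variables by that count and split the ranking into equal-size groups,
    # so variables within a group have comparable nonzero frequencies.
    order = np.argsort(nonzero_counts)
    return np.array_split(order, n_groups)

# Usage: a random sparse population of 20 solutions with 100 variables each,
# where roughly 10% of the entries are nonzero.
rng = np.random.default_rng(0)
pop = rng.random((20, 100)) * (rng.random((20, 100)) < 0.1)
groups = group_by_nonzero_frequency(pop, n_groups=4)
print([g.size for g in groups])  # e.g. [25, 25, 25, 25]
```

In a sparse LSMOP setting, such a grouping could then let group-wise evolutionary operators concentrate on the subsets of variables most likely to be nonzero, while the remaining groups are kept near zero; the interaction between regrouping and group-wise optimization is what the abstract attributes to DSGEA.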