Abstract

Multi-Objective and Many-Objective Optimization problems have been solved extensively with evolutionary algorithms over the past few decades. Although NSGA-II and NSGA-III are frequently used as references for the comparative evaluation of new evolutionary algorithms, the latter is proprietary. In this paper, we use the basic framework of the NSGA-II, which is very similar to that of the NSGA-III, with significant changes to its selection operator: we take the first front generated by the non-dominated sorting procedure to obtain non-negative and non-repeated extreme points. We call this open-source version of the NSGA-III the EF1-NSGA-III. Rather than implementing it from scratch, we extended the authors' NSGA-II code from the repository of the Kanpur Genetic Algorithms Laboratory, replacing its diversity-preservation mechanism, based on the crowding distance, with one based on reference points, while preserving its parameters. We then developed the adaptive EF1-NSGA-III (A-EF1-NSGA-III) and the efficient adaptive EF1-NSGA-III (A2-EF1-NSGA-III), and we also explain how to generate different types of reference points. The proposed algorithms solve constrained optimization problems with up to 10 objective functions. We tested them on a wide range of benchmark problems, where they showed notable improvements in convergence and diversity as measured by the Inverted Generational Distance (IGD) and HyperVolume (HV) performance metrics. We apply the EF1-NSGA-III to minimize the power consumption of Centralized Radio Access Networks and to solve the Bi-Objective Minimum Diameter-Cost Spanning Tree problem.

Highlights

  • Genetic algorithms (GAs) are random-based evolutionary methods

  • When we study GAs, we consider an initial population of pseudo-random solutions that passes through genetic operators such as selection, crossover, and mutation, which recombine and perturb solutions

  • We show the procedures for the EF1-NSGA-III, except the associate procedure, which is the same as the one from the NSGA-III authors
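The selection, crossover, and mutation operators mentioned above can be sketched in a minimal GA loop. This is an illustrative example only, not the EF1-NSGA-III implementation; the operator choices (binary tournament, single-point crossover, Gaussian mutation) and all rates are assumptions:

```python
import random

def evolve(population, fitness, generations=100, cx_prob=0.9, mut_prob=0.1):
    """Minimal single-objective GA loop (minimization). Illustrative
    sketch only; not the authors' code."""
    for _ in range(generations):
        # Binary tournament selection: keep the fitter of two random parents.
        def tournament():
            a, b = random.sample(population, 2)
            return a if fitness(a) <= fitness(b) else b

        offspring = []
        while len(offspring) < len(population):
            p1, p2 = tournament(), tournament()
            # Single-point crossover with probability cx_prob.
            if random.random() < cx_prob and len(p1) > 1:
                cut = random.randrange(1, len(p1))
                child = p1[:cut] + p2[cut:]
            else:
                child = p1[:]
            # Gaussian mutation perturbs each gene with probability mut_prob.
            child = [g + random.gauss(0, 0.1) if random.random() < mut_prob else g
                     for g in child]
            offspring.append(child)
        population = offspring
    return min(population, key=fitness)

# Toy usage: minimize the sphere function over 3 real-valued genes.
best = evolve([[random.uniform(-5, 5) for _ in range(3)] for _ in range(20)],
              fitness=lambda x: sum(g * g for g in x))
```

MOEAs such as the NSGA family replace the single fitness comparison in the tournament with Pareto dominance plus a diversity criterion.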



Introduction

Genetic algorithms (GAs) are random-based evolutionary methods, and they are often preferred over classical optimization methods. GAs evolved into Multi-Objective and Many-Objective Evolutionary Algorithms (MOEAs and MaOEAs) to optimize Multi-Objective and Many-Objective Optimization Problems (MOOPs and MaOPs) in fields such as engineering, business, mathematics, and physics (Li, Wang, Zhang, and Ishibuchi, 2018). They search for multiple solutions simultaneously across different non-convex and discontinuous regions close to the approximated Pareto front. The practical motivation of this paper is the implementation of an open-source version of the proprietary NSGA-III (Deb and Jain, 2014; Jain and Deb, 2014), called the EF1-NSGA-III (Ariza, 2019), that alleviates the dimensionality issues that arise in many-objective problems. This algorithm solves problems with more than two objective functions, checks the feasibility of the population before the non-dominated sorting procedure, and uses the first front to generate non-negative and non-repeated extreme points during the normalization procedure.
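The first front referred to above is the set of solutions that no other solution Pareto-dominates. A minimal sketch of dominance checking and first-front extraction for minimization problems (a generic illustration, not the authors' code):

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def first_front(objectives):
    """Return the first non-dominated front: the points that no other
    point in the set dominates."""
    return [p for p in objectives
            if not any(dominates(q, p) for q in objectives if q is not p)]

pts = [(1.0, 4.0), (2.0, 2.0), (3.0, 3.0), (4.0, 1.0)]
print(first_front(pts))  # [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0)]
```

Here (3.0, 3.0) is excluded because (2.0, 2.0) is at least as good in both objectives and strictly better in each; in the EF1-NSGA-III description, this first front is the input for computing the extreme points used in normalization.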

Related work
[Algorithm-listing residue from the source page; the recoverable steps are: normalize the objectives f_n; associate each member s of S_t with a reference point; move reference points from the inside layer to the boundary layer; and move the adaptive (and efficient adaptive) reference points to the j-th most crowded reference point.]
Objective
Conclusions
