Abstract
This paper investigates the integration of a surrogate-assisted multi-objective evolutionary algorithm (MOEA) and a parallel computation scheme to reduce the computing time that evolutionary algorithms (EAs) need to obtain optimal solutions. A surrogate-assisted MOEA solves multi-objective optimization problems while estimating the evaluation values of solutions with a surrogate function produced by a machine learning model. This paper uses an extreme-learning-machine-assisted MOEA/D (ELMOEA/D), which combines a well-known MOEA, MOEA/D, with a machine learning technique, the extreme learning machine (ELM). A parallel MOEA, on the other hand, evaluates solutions in parallel on multiple computing nodes to accelerate the optimization process. We consider a synchronous and an asynchronous parallel MOEA as master-slave parallelization schemes for ELMOEA/D. We carry out experiments on multi-objective optimization problems to compare the synchronous parallel ELMOEA/D with the asynchronous parallel ELMOEA/D. In the experiments, we simulate two settings of the evaluation time of solutions: in the first, the evaluation time is drawn from a normal distribution with different variances; in the second, the evaluation time correlates with the objective function value. We compare the quality of the solutions obtained by the parallel ELMOEA/D variants within a fixed computing time. The experimental results show that parallelization significantly reduces the computing time of ELMOEA/D. In addition, the asynchronous parallel ELMOEA/D obtains high-quality solutions more quickly than the synchronous parallel ELMOEA/D.
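For illustration, the sketch below contrasts the two master-slave schemes on a toy problem. It is a minimal sketch, not the paper's implementation: all names (evaluate, propose_candidate, num_slaves, and so on) are hypothetical, and the random candidate generator stands in for the surrogate-assisted search. The normally distributed sleep mimics the paper's first evaluation-time setting; the second setting would instead make the sleep depend on the objective value.

```python
import concurrent.futures
import random
import time


def evaluate(x):
    # Simulated expensive evaluation. The sleep mimics a normally distributed
    # evaluation time (parameters here are arbitrary); returning the candidate
    # together with a toy objective value keeps the archive self-contained.
    time.sleep(max(0.0, random.gauss(0.1, 0.03)))
    return x, sum(xi ** 2 for xi in x)


def propose_candidate(archive):
    # Stand-in for the surrogate-assisted step: ELMOEA/D would train an ELM
    # surrogate on `archive` and search it with MOEA/D; here we sample at random.
    return [random.random() for _ in range(5)]


def synchronous_run(num_slaves=4, generations=3):
    archive = []
    with concurrent.futures.ThreadPoolExecutor(num_slaves) as pool:
        for _ in range(generations):
            batch = [propose_candidate(archive) for _ in range(num_slaves)]
            # The master waits for *all* slaves, so each generation takes as
            # long as the slowest evaluation in the batch.
            archive.extend(pool.map(evaluate, batch))
    return archive


def asynchronous_run(num_slaves=4, total_evals=12):
    archive = []
    with concurrent.futures.ThreadPoolExecutor(num_slaves) as pool:
        pending = {pool.submit(evaluate, propose_candidate(archive))
                   for _ in range(num_slaves)}
        while pending:
            # As soon as any single evaluation finishes, that slave immediately
            # receives a new candidate; no slave idles waiting for the others.
            done, pending = concurrent.futures.wait(
                pending, return_when=concurrent.futures.FIRST_COMPLETED)
            for fut in done:
                archive.append(fut.result())
                if len(archive) + len(pending) < total_evals:
                    pending.add(pool.submit(evaluate, propose_candidate(archive)))
    return archive
```

The difference the paper exploits is visible here: when evaluation times vary, the synchronous loop pays for the slowest slave in every generation, while the asynchronous loop keeps all slaves busy.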
Highlights
This paper investigates the integration of a surrogate-assisted multi-objective evolutionary algorithm (MOEA) and a parallel computation scheme to reduce the computing time that evolutionary algorithms (EAs) need to obtain optimal solutions
This section first explains what a multi-objective optimization problem is, and then shows the detailed algorithm of MOEA/D (Zhang and Li 2007), the underlying algorithm of ELMOEA/D
We first give an overview of the extreme learning machine (ELM), and then provide the details of ELMOEA/D (a minimal ELM sketch follows these highlights)
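As a brief reminder of how an ELM works (the standard formulation, not the paper's code): the hidden-layer weights are drawn at random and kept fixed, and only the output weights are fitted by a linear least-squares solve, which is what makes training and retraining the surrogate cheap. The function names below are hypothetical.

```python
import numpy as np


def elm_fit(X, y, hidden=50, seed=0):
    # Random, fixed input weights and biases; only the output weights (beta)
    # are learned, via a least-squares solve on the hidden-layer activations.
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], hidden))
    b = rng.normal(size=hidden)
    H = np.tanh(X @ W + b)
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return W, b, beta


def elm_predict(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta
```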
Summary
Previous works have proposed surrogate-assisted EAs (SAEAs) (Jin 2011; Haftka et al. 2016) to reduce the computational time required for the EA process. Surrogate-assisted optimization selects promising solutions that are expected to exhibit superior actual evaluation values, and only these are passed to the time-consuming evaluation process. It is generally assumed that the evaluation time of solutions is long and varies from solution to solution. For this reason, we consider the integration of an SAEA with the asynchronous parallelization scheme to be effective. This paper conducts experiments that compare SP-ELMOEA/D with AP-ELMOEA/D to investigate which parallelization scheme is appropriate for ELMOEA/D. We test these methods on well-known multi-objective optimization benchmarks: the ZDT series (Zitzler et al. 2000), the WFG series (Huband et al. 2005), and the DTLZ series (Deb et al. 2002).
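For concreteness, ZDT1, the first problem of the ZDT series cited above, is a standard bi-objective minimization problem; the definition below follows Zitzler et al. (2000) and is included only as an illustration of the kind of benchmark used.

```python
import math


def zdt1(x):
    # ZDT1 (Zitzler et al. 2000): minimize (f1, f2) over x_i in [0, 1];
    # the Pareto-optimal front corresponds to g(x) = 1, i.e. x_2 = ... = x_n = 0.
    f1 = x[0]
    g = 1.0 + 9.0 * sum(x[1:]) / (len(x) - 1)
    f2 = g * (1.0 - math.sqrt(f1 / g))
    return f1, f2
```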