Abstract

Multi-tasking optimization (MTO) has recently become a rising research topic in evolutionary computation, attracting increasing attention from academia. In contrast to single-objective optimization (SOO) and multi-objective optimization (MOO), MTO solves multiple optimization tasks simultaneously by exploiting inter-task similarities and complementarities. The classical multifactorial evolutionary algorithm (MFEA) transfers inter-task knowledge through its crossover operator. To broaden the search region and accelerate convergence, this paper integrates differential evolution (DE) and opposition-based learning (OBL) into MFEA, yielding MFEA/DE-OBL. The motivation for integrating DE and OBL is that they explore different search neighborhoods and are strongly complementary to the simulated binary crossover (SBX) used in MFEA; moreover, they help MFEA escape local optima. The effectiveness and efficiency of integrating DE and OBL into MFEA are studied experimentally on a set of benchmark problems with varying degrees of inter-task similarity. Experimental results demonstrate that the proposed MFEA/DE-OBL substantially outperforms MFEA.
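To make the two borrowed operators concrete, here is a minimal sketch, not the paper's actual implementation, of the standard DE/rand/1 mutation and the classical OBL opposite-point rule that MFEA/DE-OBL combines with SBX. All names (`de_rand_1`, `opposite`) and parameter values are illustrative assumptions; the paper's exact variant and parameterization may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def de_rand_1(pop, i, F=0.5):
    """Standard DE/rand/1 mutation: v = x_r1 + F * (x_r2 - x_r3),
    with r1, r2, r3 distinct indices different from i."""
    candidates = [j for j in range(len(pop)) if j != i]
    r1, r2, r3 = rng.choice(candidates, size=3, replace=False)
    return pop[r1] + F * (pop[r2] - pop[r3])

def opposite(x, lower, upper):
    """Classical opposition-based learning: reflect x through the
    midpoint of the search box [lower, upper]."""
    return lower + upper - x

# Toy usage on a 2-D box [-1, 1]^2
pop = rng.uniform(-1.0, 1.0, size=(10, 2))
mutant = de_rand_1(pop, 0)          # explores along difference vectors
opp = opposite(pop[0], -1.0, 1.0)   # probes the opposite region of the box
```

The point of the combination is that the DE difference vector and the OBL reflection sample regions SBX's parent-centric distribution rarely reaches, which is the complementarity the abstract refers to.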
