Solving a complex optimization task from scratch can be computationally expensive and time-consuming. Common knowledge obtained from different (but possibly related) optimization tasks may help enhance the solving of such tasks. In this regard, evolutionary multitasking optimization (EMTO) has been proposed to solve multiple optimization tasks simultaneously via knowledge transfer within the evolutionary algorithm framework. The effectiveness of knowledge transfer is crucial to the success of EMTO. The multifactorial evolutionary algorithm (MFEA) is one of the most representative EMTO algorithms; however, it suffers from negative knowledge transfer among tasks with low correlation. To address this issue, this study integrates inter-task gene-similarity-based knowledge transfer and mirror transformation into MFEA (termed MFEA-GSMT). In the proposed inter-task gene-similarity-based knowledge transfer, a probabilistic model is used to characterize each gene, and the Kullback-Leibler divergence is employed to measure inter-task dimension similarity. Guided by the inter-task gene similarity, a selective crossover is used to reproduce offspring solutions. The proposed inter-task knowledge transfer is based on online gene similarity evaluation, rather than individual similarity, to overcome the imprecise estimation of population distributions in a high-dimensional space with only a small number of samples. The proposed mirror transformation is an extension of opposition-based learning that helps avoid premature convergence and explore additional promising search areas. Experimental results on both single-objective and multi-objective multitasking problems demonstrate the effectiveness and efficiency of the proposed MFEA-GSMT.
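The two key ingredients named above can be illustrated with a minimal sketch. This is a hypothetical reading of the abstract, not the paper's implementation: each gene (dimension) is assumed to be modeled as a univariate Gaussian fitted to a task's population, per-gene similarity is derived from the symmetric KL divergence between the two Gaussians, and the mirror transformation is sketched as the opposition-based-learning mirror point within box bounds. The function names and the divergence-to-similarity mapping `1 / (1 + KL)` are assumptions.

```python
import numpy as np

def gene_kl_similarity(pop_a, pop_b, eps=1e-12):
    """Per-gene similarity between two task populations.

    Each column (gene) is modeled as a univariate Gaussian fitted to
    that population's values; similarity is mapped from the symmetric
    KL divergence between the two Gaussians (sketch only -- the paper's
    exact probabilistic model and mapping may differ).
    """
    mu_a, mu_b = pop_a.mean(axis=0), pop_b.mean(axis=0)
    var_a = pop_a.var(axis=0) + eps  # eps guards against zero variance
    var_b = pop_b.var(axis=0) + eps
    # Closed-form KL(N_a || N_b) for univariate Gaussians, per dimension
    kl_ab = 0.5 * (np.log(var_b / var_a) + (var_a + (mu_a - mu_b) ** 2) / var_b - 1.0)
    kl_ba = 0.5 * (np.log(var_a / var_b) + (var_b + (mu_b - mu_a) ** 2) / var_a - 1.0)
    sym_kl = kl_ab + kl_ba
    return 1.0 / (1.0 + sym_kl)  # maps divergence in [0, inf) to similarity in (0, 1]

def mirror_transform(x, low, high):
    """Opposition-based mirror point of x within box bounds [low, high]."""
    return low + high - x
```

Genes with high similarity would then be favored by the selective crossover when transferring material between tasks, while the mirror points offer candidate solutions in unexplored regions of the search space.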