Abstract

The seagull optimization algorithm (SOA) suffers from low convergence accuracy, weak population diversity, and a tendency to fall into local optima, especially on high-dimensional and multimodal problems. To overcome these shortcomings, this study first proposes a shared SOA (SSOA) that combines a sharing multi-leader strategy with a self-adaptive mutation operator. In addition, seven new variants of the SSOA are proposed, employing the Gaussian mutation operator, the Cauchy mutation operator, the Lévy-flights mutation operator, the improved Tent chaos mutation operator, the neighborhood centroid opposition-based learning mutation operator, the elite opposition-based learning mutation operator, and the simulated annealing algorithm combined with other mutation operators, named GSSOA, CSSOA, LFSSOA, ITSSOA, ESSOA, NSSOA, and CMSSOA, respectively. The performance of these variants is then evaluated on 23 benchmark functions, and the best variant is further evaluated on a comprehensive set of 43 benchmark problems and three real-world problems in comparison with other optimizers. Experimental and statistical results demonstrate that the proposed CMSSOA outperforms the other SSOA variants and competitor approaches.
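As a rough illustration of the kinds of mutation operators the abstract names, the sketch below shows Gaussian, Cauchy, and Lévy-flight perturbations of a candidate solution. This is a generic formulation under common conventions (e.g., Mantegna's algorithm for Lévy steps), not the paper's exact update rules; the function names and step-scale parameters are illustrative assumptions.

```python
import math
import random

def gaussian_mutation(x, sigma=0.1):
    # Add zero-mean Gaussian noise to each dimension (illustrative scale).
    return [xi + random.gauss(0.0, sigma) for xi in x]

def cauchy_mutation(x, scale=0.1):
    # Standard Cauchy noise via the inverse CDF: tan(pi * (u - 0.5)).
    # Heavy tails give occasional large jumps, aiding escape from local optima.
    return [xi + scale * math.tan(math.pi * (random.random() - 0.5)) for xi in x]

def levy_step(beta=1.5):
    # Mantegna's algorithm for a Lévy-stable step of index beta.
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = random.gauss(0.0, sigma)
    v = random.gauss(0.0, 1.0)
    return u / abs(v) ** (1 / beta)

def levy_mutation(x, scale=0.01):
    # Perturb each dimension with an independent Lévy-flight step.
    return [xi + scale * levy_step() for xi in x]
```

In mutation-based SOA variants such schemes are typically applied to the leader or to selected individuals each iteration, with the mutated solution kept only if it improves fitness.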
