Abstract

The seagull optimization algorithm (SOA) suffers from low convergence accuracy, weak population diversity, and a tendency to become trapped in local optima, especially on high-dimensional and multimodal problems. To overcome these shortcomings, this study first proposes a shared SOA (SSOA) that combines a sharing multi-leader strategy with a self-adaptive mutation operator. In addition, seven new variants of the SSOA algorithm are proposed, employing the Gaussian mutation operator, the Cauchy mutation operator, the Lévy flights mutation operator, the improved Tent chaos mutation operator, the neighborhood centroid opposition-based learning mutation operator, the elite opposition-based learning mutation operator, and a simulated annealing algorithm combined with other mutation operators, yielding GSSOA, CSSOA, LFSSOA, ITSSOA, ESSOA, NSSOA, and CMSSOA, respectively. The performance of these variants was then evaluated on 23 benchmark functions, and the best variant was further assessed on a comprehensive set of 43 benchmark problems and three real-world problems against other optimizers. Experimental and statistical results demonstrate that the proposed CMSSOA algorithm outperforms the other SSOA variants and the competitor approaches.
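To make the role of the mutation operators concrete, the following is a minimal, hypothetical sketch of three of the perturbation schemes named above (Gaussian, Cauchy, and Lévy-flight mutation) applied to a candidate solution vector. The function names, step-size parameters, and the use of Mantegna's algorithm for the Lévy step are illustrative assumptions, not the paper's exact formulation.

```python
import math
import numpy as np

rng = np.random.default_rng(0)

def gaussian_mutation(x, sigma=0.1):
    # Perturb each component with zero-mean Gaussian noise (local search).
    return x + sigma * rng.standard_normal(x.shape)

def cauchy_mutation(x, scale=0.1):
    # Cauchy noise has heavy tails, so occasional large jumps help
    # escape local optima.
    return x + scale * rng.standard_cauchy(x.shape)

def levy_flight_mutation(x, beta=1.5, step=0.01):
    # Lévy-distributed step via Mantegna's algorithm: u / |v|^(1/beta),
    # a common way to draw approximate Lévy flights (an assumption here).
    sigma_u = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
               (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))
               ) ** (1 / beta)
    u = rng.normal(0.0, sigma_u, size=x.shape)
    v = rng.standard_normal(x.shape)
    return x + step * u / np.abs(v) ** (1 / beta)

# Example: mutate a 5-dimensional candidate solution.
x = np.zeros(5)
print(gaussian_mutation(x))
print(cauchy_mutation(x))
print(levy_flight_mutation(x))
```

In a self-adaptive scheme such as the one the abstract describes, the step-size parameters (`sigma`, `scale`, `step`) would typically be tuned during the run rather than fixed.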
