Abstract

Optimization algorithms aim to find the values that maximize or minimize a function under given circumstances. There are many approaches to solving optimization problems; stochastic population-based approaches tend to give the best results in a reasonable time. Two state-of-the-art stochastic optimization algorithms are Genetic Algorithms (GA) and Particle Swarm Optimization (PSO). The Sine-Cosine Algorithm (SCA) is one of the more recently developed stochastic population-based optimization algorithms. It is claimed to converge faster than its counterparts, and it occasionally outperforms other optimization algorithms, including GA and PSO. The algorithm is successful because it balances exploration and exploitation smoothly. In previous studies, the above-mentioned algorithms were evaluated and compared on unconstrained optimization test functions, but there is no study on constrained optimization test problems. In this study, we aim to show the performance of the Sine-Cosine Algorithm on constrained optimization problems. In order to achieve this, we compare the performances of these algorithms on well-known constrained test functions.
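The smooth trade-off between exploration and exploitation mentioned above comes from the way the standard Sine-Cosine Algorithm moves each candidate toward the current best solution with sine- and cosine-weighted steps whose amplitude shrinks over the iterations. The following is a minimal Python sketch of that standard formulation (following Mirjalili's published update rule); the function name sca_minimize, the population size, the iteration budget, and the constant a are illustrative assumptions, not the settings used in this study.

    import numpy as np

    def sca_minimize(f, lb, ub, n_agents=30, n_iter=500, a=2.0, seed=0):
        """Minimal sketch of the standard Sine-Cosine Algorithm for minimization.

        f        : objective function to minimize
        lb, ub   : per-dimension lower/upper bounds
        n_agents : population size (assumed value)
        a        : constant controlling the linear decay of r1 (assumed value)
        """
        rng = np.random.default_rng(seed)
        lb, ub = np.asarray(lb, float), np.asarray(ub, float)
        dim = lb.size
        X = rng.uniform(lb, ub, size=(n_agents, dim))       # random initial population
        fitness = np.apply_along_axis(f, 1, X)
        best = X[fitness.argmin()].copy()                    # destination point P
        best_f = fitness.min()

        for t in range(n_iter):
            r1 = a - t * a / n_iter                          # shrinking step size: exploration -> exploitation
            r2 = rng.uniform(0, 2 * np.pi, size=(n_agents, dim))
            r3 = rng.uniform(0, 2, size=(n_agents, dim))
            r4 = rng.uniform(size=(n_agents, dim))
            step = np.abs(r3 * best - X)
            X = np.where(r4 < 0.5,
                         X + r1 * np.sin(r2) * step,         # sine branch
                         X + r1 * np.cos(r2) * step)         # cosine branch
            X = np.clip(X, lb, ub)
            fitness = np.apply_along_axis(f, 1, X)
            if fitness.min() < best_f:
                best_f = fitness.min()
                best = X[fitness.argmin()].copy()
        return best, best_f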

Highlights

  • Optimization can be defined as finding the most effective and highest achievable performance under the given limitations

  • All the stochastic algorithms are directly suited to unconstrained optimization problems

  • We aim to show the performance of the Sine-Cosine Algorithm on constrained optimization problems and compare the results with the Genetic Algorithm and Particle Swarm Optimization


Summary

Introduction

Optimization can be defined as finding the most effective and highest achievable performance under the given limitations. A set of values that satisfies all the constraints of an optimization problem is a feasible solution. For many years, optimization problems have been solved with classical mathematical methods. Deterministic methods have the great advantage of finding global optima, but it is impossible to develop a single method that solves all nonlinear problems. Stochastic algorithms can find promising solutions for difficult optimization problems, although there is no guarantee that the optimal solution is reached every time. Stochastic algorithms are good at solving most real-world problems, which are nonlinear and multimodal [1]. Such problems must be optimized under conditions such as resource, time, cost, and design constraints, and the best value must be found subject to these conditions.
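Because GA, PSO, and SCA are formulated for unconstrained search, constrained test functions are typically handled by transforming the problem, for example with a penalty function that degrades the fitness of infeasible solutions. The sketch below is a generic static-penalty wrapper, not necessarily the constraint-handling scheme used in this paper; it reuses the sca_minimize sketch above, and the penalty coefficient rho, the tolerance eps, and the example constraint are illustrative assumptions.

    def penalized(f, g_list, h_list=(), rho=1e6, eps=1e-4):
        """Wrap an objective f with a static penalty for constraint violations.

        g_list : inequality constraints, each feasible when g(x) <= 0
        h_list : equality constraints, each feasible when h(x) == 0
        rho    : penalty coefficient (assumed value)
        eps    : tolerance for equality constraints (assumed value)
        """
        def wrapped(x):
            viol = sum(max(0.0, g(x)) ** 2 for g in g_list)
            viol += sum(max(0.0, abs(h(x)) - eps) ** 2 for h in h_list)
            return f(x) + rho * viol
        return wrapped

    # Example: minimize x0^2 + x1^2 subject to x0 + x1 >= 1, i.e. g(x) = 1 - x0 - x1 <= 0
    obj = lambda x: x[0] ** 2 + x[1] ** 2
    g1 = lambda x: 1.0 - x[0] - x[1]
    best, best_f = sca_minimize(penalized(obj, [g1]), lb=[-5, -5], ub=[5, 5])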

Related Works
Dealing with constraints
Experimental Results
Conclusions and Discussion
