Abstract

This paper focuses on a subclass of box-constrained, non-linear optimization problems. We are particularly concerned with settings where gradient information is unreliable or too costly to calculate, and where the function evaluations themselves are very costly. This encourages the use of derivative-free optimization methods, and especially a subclass of these referred to as direct search methods. The thrust of our investigation is twofold. First, we implement and evaluate a number of traditional direct search methods, assessing their suitability as local optimizers within a metaheuristic framework. Second, we introduce a new direct search method, based on Scatter Search, designed to remedy the lack of a good derivative-free method for high-dimensional problems. Our new direct search method has convergence properties comparable to those of existing methods, in addition to solving larger problems more effectively.
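To make the setting concrete, the sketch below shows one classic direct search method (coordinate, or compass, search) applied to a box-constrained problem. It is an illustrative example of the general class the abstract refers to, not the paper's proposed Scatter Search-based method; the function names and parameters are our own choices. The method polls the 2n axis directions around the current point, accepts any improvement, and halves the step size on failure, using no gradient information.

```python
import numpy as np

def compass_search(f, x0, lower, upper, step=0.5, tol=1e-6, max_evals=10_000):
    """Minimize f over the box [lower, upper] by compass (coordinate) search.

    A minimal sketch of a classic direct search method: poll the 2n axis
    directions; if no poll point improves, halve the step. Gradient-free.
    """
    x = np.clip(np.asarray(x0, dtype=float), lower, upper)
    fx = f(x)
    evals = 1
    n = len(x)
    while step > tol and evals < max_evals:
        improved = False
        for i in range(n):
            for sign in (+1.0, -1.0):
                trial = x.copy()
                # Project the poll point back onto the box constraints.
                trial[i] = np.clip(trial[i] + sign * step, lower[i], upper[i])
                ft = f(trial)
                evals += 1
                if ft < fx:  # accept the first improving poll point
                    x, fx = trial, ft
                    improved = True
                    break
        if not improved:
            step *= 0.5  # shrink the pattern and poll again
    return x, fx
```

Each iteration costs at most 2n evaluations, which illustrates why such methods degrade in high dimensions when evaluations are expensive, the gap the proposed method targets.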
