Abstract

We describe the optimization algorithm implemented in the open-source derivative-free solver RBFOpt. The algorithm is based on the radial basis function method of Gutmann and the metric stochastic response surface method of Regis and Shoemaker. We propose several modifications aimed at generalizing and improving these two algorithms: (i) the use of an extended space to represent categorical variables in unary encoding; (ii) a refinement phase to locally improve a candidate solution; (iii) interpolation models without the unisolvence condition, both to help deal with categorical variables and to initiate the optimization before a uniquely determined model is possible; (iv) a master-worker framework to allow asynchronous objective function evaluations in parallel. Numerical experiments show the effectiveness of these ideas.
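As background for the surrogate models discussed in the paper, the sketch below fits a radial basis function interpolant with the cubic basis φ(r) = r³ and a linear polynomial tail, solving the standard augmented linear system. This is a minimal illustration using NumPy, not RBFOpt's implementation; the function names are ours.

```python
import numpy as np

def fit_cubic_rbf(points, values):
    """Fit s(x) = sum_i lam_i * phi(|x - x_i|) + c^T x + c0 with phi(r) = r^3.

    points: (k, n) array of sample sites, values: (k,) objective values.
    """
    k, n = points.shape
    r = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    Phi = r ** 3                                # cubic RBF kernel matrix
    P = np.hstack([points, np.ones((k, 1))])    # linear polynomial tail
    # Augmented interpolation system: [Phi P; P^T 0] [lam; c] = [f; 0]
    A = np.block([[Phi, P], [P.T, np.zeros((n + 1, n + 1))]])
    rhs = np.concatenate([values, np.zeros(n + 1)])
    coef = np.linalg.solve(A, rhs)
    lam, c = coef[:k], coef[k:]

    def s(x):
        d = np.linalg.norm(points - x, axis=1)
        return d ** 3 @ lam + np.append(x, 1.0) @ c
    return s

# Interpolate f(x) = x_1^2 + x_2^2 at five sample points.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.2]])
vals = (pts ** 2).sum(axis=1)
s = fit_cubic_rbf(pts, vals)
```

By construction the interpolant reproduces the sampled values exactly at the sample sites; the cubic basis is conditionally positive definite of order 2, which is why the linear tail is required for the system to be nonsingular.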

Highlights

  • An optimization problem without any structural information on the objective function or the constraints, but for which we have the ability to evaluate them at given points, is called a black-box problem

  • The area of derivative-free optimization is dedicated to the study of optimization algorithms that do not rely on computing the partial derivatives of the objective function, and it is naturally applied to black-box problems

  • The algorithm uses the surrogate model to determine the point at which the objective function should be evaluated; this decision is based on criteria first introduced in [16, 33], together with the modifications discussed in [10]. We generalize these approaches in multiple ways, the most notable of which is the following: (i) we introduce a surrogate model defined in an extended space that maps categorical variables to their unary encoding, and show that all steps of the optimization algorithm can be performed in a natural way in either the original or the extended space
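The unary (one-hot) encoding mentioned in the highlight above can be illustrated with a short sketch: each categorical variable with L levels expands into L binary coordinates in the extended space, while continuous and integer variables pass through unchanged. This is our own illustrative code, not RBFOpt's internal representation.

```python
def to_extended(x, categorical):
    """Map a point to the extended space.

    x: list of variable values.
    categorical: dict mapping a variable index to its list of levels.
    Each categorical coordinate expands to a unary (one-hot) block;
    all other coordinates are copied as-is.
    """
    z = []
    for i, v in enumerate(x):
        if i in categorical:
            levels = categorical[i]
            z.extend(1.0 if v == lev else 0.0 for lev in levels)
        else:
            z.append(v)
    return z

# A point (2.5, 'B') where variable 1 is categorical over ('A', 'B', 'C')
# maps to the 4-dimensional extended point [2.5, 0.0, 1.0, 0.0].
print(to_extended([2.5, 'B'], {1: ['A', 'B', 'C']}))
```

In the extended space the surrogate model treats the unary block like any other coordinates, which is what allows the interpolation and search steps to be carried out uniformly.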


Summary

Introduction

An optimization problem without any structural information on the objective function or the constraints, but for which we have the ability to evaluate them at given points, is called a black-box problem. This paper discusses the implementation of a global derivative-free optimization algorithm aimed at black-box problems with expensive objective function evaluations. We remark that RBFOpt is designed for deterministic black-box optimization problems, rather than hyperparameter optimization problems, where the result of each objective evaluation is typically a sample from a random variable. We can nonetheless use RBFOpt in the latter setting by fixing the dataset and the random seed used to train the classifier, thereby making the objective function deterministic. This runs the risk of overfitting, as RBFOpt only observes one realization of a generalization error estimator, but in practice it can be an acceptable tradeoff.
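The idea of making a stochastic training objective deterministic by fixing the dataset and the random seed can be sketched as a simple wrapper. The names below are illustrative, and the toy scoring routine stands in for an actual training-and-validation run.

```python
import random

def make_deterministic_objective(train_and_score, dataset, seed=0):
    """Wrap a stochastic training routine so that repeated evaluations of
    the same hyperparameters always return the same value."""
    def objective(hyperparams):
        random.seed(seed)                 # reset the RNG on every call
        return train_and_score(dataset, hyperparams)
    return objective

# Toy 'training' routine whose score depends on the RNG state.
def noisy_score(dataset, hp):
    return hp[0] ** 2 + 0.01 * random.random()

obj = make_deterministic_objective(noisy_score, dataset=None, seed=0)
# Same hyperparameters now yield the same objective value.
print(obj([1.0]) == obj([1.0]))
```

As the text notes, the wrapped objective observes only one realization of the generalization-error estimator, so the deterministic view can overfit to that realization.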

Surrogate models with radial basis functions
Optimization with categorical variables in extended space
Description of the optimization algorithm
Solution of linear systems and non-unique interpolants
Determining the next point
Gutmann’s RBF algorithm
MSRSM algorithm
Solution of the search problems
Repairing numerical errors
Automatic model selection
Parallel optimizer
Computational experiments
Test instances
Comparison of algorithmic variants
Categorical variables: original versus extended space
Comparison with existing open-source derivative-free solvers
Parallel optimization
Application to hyperparameter optimization
Findings
Conclusion
