More Efficient Bayesian Optimization Through the Use of Common Random Numbers

Abstract

Bayesian optimization is a powerful tool for expensive stochastic black-box optimization problems, such as simulation-based optimization or hyperparameter tuning in machine learning systems. In “Bayesian Optimization Allowing for Common Random Numbers,” Pearce, Poloczek, and Branke show how explicitly modeling the random seed in the Gaussian process surrogate model allows Bayesian optimization to exploit the structure in the noise and benefit from the variance reduction provided by common random numbers. The proposed knowledge gradient with common random numbers acquisition function iteratively selects a combination of input and random seed at which to evaluate the objective. It automatically trades off reusing old seeds, which benefits from the variance reduction of common random numbers, against querying new seeds, which avoids the bias that a small number of seeds would introduce. The proposed algorithm is analyzed theoretically and shows superior empirical performance compared with previous approaches on a variety of test problems.
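To make the modeling idea concrete, the sketch below treats the random seed as an extra input to the Gaussian process surrogate, so that evaluations sharing a seed are correlated. The additive kernel form k((x, s), (x', s')) = k_obj(x, x') + 1[s = s'] k_crn(x, x'), the hyperparameter values, and all function names are assumptions made for this illustration; they are not the paper's exact model or code.

```python
import numpy as np

# Illustrative hyperparameters (assumed values, not tuned or from the paper).
LS_OBJ, VAR_OBJ = 0.3, 1.0    # kernel of the latent objective theta(x)
LS_CRN, VAR_CRN = 0.5, 0.25   # kernel of the seed-matched noise eps(x, s)
NOISE = 1e-4                  # jitter for numerical stability

def rbf(a, b, ls, var):
    """Squared-exponential kernel between 1-D input arrays."""
    return var * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

def k_f(X, S, X2, S2):
    """Covariance of observations f(x, s) = theta(x) + eps(x, s):
    the objective kernel, plus the CRN kernel whenever the seeds match."""
    same_seed = (S[:, None] == S2[None, :]).astype(float)
    return rbf(X, X2, LS_OBJ, VAR_OBJ) + same_seed * rbf(X, X2, LS_CRN, VAR_CRN)

def posterior_theta(Xq, X, S, y):
    """Posterior mean/variance of the seed-averaged objective theta at Xq.
    cov(theta(xq), f(x, s)) contains only the k_obj term, so the shared-seed
    noise component is averaged out of the prediction."""
    K = k_f(X, S, X, S) + NOISE * np.eye(len(X))
    Ks = rbf(Xq, X, LS_OBJ, VAR_OBJ)
    Kss = rbf(Xq, Xq, LS_OBJ, VAR_OBJ)
    alpha = np.linalg.solve(K, y)
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return Ks @ alpha, np.diag(cov)

# Toy data: seeds 1 and 2 are each reused across inputs, so observations
# taken under the same seed share one noise realization (a simple CRN setup).
X = np.array([0.1, 0.1, 0.5, 0.5, 0.9])
S = np.array([1, 2, 1, 2, 1])
eps = {1: 0.25, 2: -0.40}                    # one noise draw per seed (toy)
y = np.sin(3 * X) + np.array([eps[s] for s in S])

mean, var = posterior_theta(np.linspace(0.0, 1.0, 5), X, S, y)
print("posterior mean:", np.round(mean, 3))
print("posterior var: ", np.round(var, 3))
```

Under this kernel, two evaluations that reuse the same seed have correlated noise, so their difference is estimated with reduced variance, while predictions of the seed-averaged objective discount the shared noise entirely. That variance-reduction effect is what the proposed acquisition function weighs against the bias of relying on too few seeds.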
