Abstract

The key challenges of Bayesian optimization in high dimensions are learning the response surface and optimizing the acquisition function, which selects a new point at which to evaluate the black-box function. Both challenges can be addressed by making simplifying assumptions, such as additivity or intrinsic lower dimensionality of the expensive objective. In this article, we exploit the effective lower dimensionality with axis-aligned projections and optimize on a partitioning of the input space. Axis-aligned projections introduce a multiplicity of outputs for a single input, which we refer to as inconsistency. We model inconsistencies with a Gaussian process (GP) derived from quantile regression. We show that the quantile GP and the partitioning of the input space increase data efficiency. In particular, by modeling only a quantile function, we overcome issues of GP hyper-parameter learning in the presence of inconsistencies.
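To make the notion of inconsistency concrete, here is a minimal Python sketch; the objective, dimensionality, and random fill-in rule are illustrative assumptions, not the paper's setup. Evaluating a high-dimensional objective through an axis-aligned projection can return different outputs for the same projected input, because the inactive coordinates are filled in differently on each call:

    import numpy as np

    rng = np.random.default_rng(0)
    D = 10                 # ambient dimensionality (assumed for illustration)
    active = [0, 1]        # axis-aligned subspace: keep coordinates 0 and 1

    def objective(x):
        # Hypothetical stand-in for the expensive black-box function.
        return float(np.sum(np.sin(3.0 * x) * x ** 2))

    def evaluate_projected(z):
        # Embed the 2-D point z into R^D; the remaining D-2 coordinates are
        # filled in at random, so repeated calls with the same z can return
        # different values -- the "inconsistency" the quantile GP models.
        x = rng.uniform(-1.0, 1.0, size=D)
        x[active] = z
        return objective(x)

    z = np.array([0.3, -0.7])
    print(evaluate_projected(z), evaluate_projected(z))  # two different outputs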

Highlights

  • Studies in robotics, machine learning, software development, recommendation systems and medicine are governed by design and parameter choices

  • We analyze the performance of quantile-GP (QGP) BayesOpt in a high-dimensional search space when no assumptions are made on the additive decomposability of the objective

  • We propose a framework for scaling Bayesian optimization to high dimensions by using axis-aligned projections

Summary

Introduction

Machine learning, software development, recommendation systems, and medicine are governed by design and parameter choices. In Bayesian optimization, the exploration/exploitation trade-off needed to reach a globally optimal solution is handled by an acquisition function. This approach has proven successful in various fields such as movie recommendation systems [21], parameter estimation of biological models [23], and automatic algorithm configuration [10]. Axis-aligned projections, however, introduce inconsistencies; one remedy is to train each model with a separate dataset, and Ulmasov et al. [23] accordingly adopt multiple separate subsets of experiments, resulting in a data-costly strategy.
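The paper's alternative is to model a quantile of the inconsistent outputs. Quantile regression rests on the pinball loss, whose expected value is minimized exactly at the target quantile; the following generic sketch illustrates that property (it is not the paper's QGP likelihood, and the data and quantile level are assumed):

    import numpy as np

    def pinball_loss(y, q, tau):
        # Pinball (quantile) loss: its expectation over y is minimized
        # when q equals the tau-quantile of y's distribution.
        r = y - q
        return np.mean(np.maximum(tau * r, (tau - 1.0) * r))

    rng = np.random.default_rng(1)
    y = rng.normal(size=10_000)   # stand-in for inconsistent outputs at one input
    tau = 0.25
    grid = np.linspace(-2.0, 2.0, 401)
    losses = [pinball_loss(y, q, tau) for q in grid]
    print(grid[np.argmin(losses)], np.quantile(y, tau))  # both near -0.674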

Problem setting
Gaussian processes
Bayesian optimization
High-dimensional Bayesian optimization with projections
Quantile GP regression
Experiments
Sensitivity analysis
Additive high-dimensional objectives
Non-additive high-dimensional objective
Rotated high-dimensional objective
Conclusion