Abstract

Hyperparameter tuning plays a significant role when building a machine learning or deep learning model. The tuning process aims to find the optimal hyperparameter setting for a model or algorithm from a pre-defined search space of hyperparameter configurations. Several tuning algorithms have been proposed in recent years, and there is still scope for improvement in achieving a better exploration-exploitation tradeoff over the search space. In this paper, we present a novel hyperparameter tuning algorithm named adaptive Bayesian contextual hyperband (Adaptive BCHB) that incorporates a new sampling approach to identify the best regions of the search space and to exploit those configurations that produce the minimum validation loss by dynamically updating a threshold in every iteration. The proposed algorithm is assessed using benchmark models and datasets on traditional machine learning tasks. The proposed Adaptive BCHB algorithm shows a significant improvement in terms of accuracy and computational time for different types of hyperparameters when compared with state-of-the-art tuning algorithms.
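
To make the described mechanism concrete, below is a minimal, hypothetical sketch of a Hyperband-style successive-halving loop in which a loss threshold is tightened at every iteration, so that later rounds exploit only configurations from promising regions of the search space. The search space, the stand-in objective, and the specific threshold rule are illustrative assumptions, not the authors' exact Adaptive BCHB procedure.

```python
# Hypothetical illustration only: a successive-halving loop with a loss
# threshold that is updated each round, in the spirit of the abstract above.
import random

def sample_config():
    # Hypothetical two-hyperparameter search space (assumed for illustration).
    return {"lr": 10 ** random.uniform(-4, -1), "depth": random.randint(2, 10)}

def validation_loss(config, budget):
    # Stand-in for training a model with `config` under a resource `budget`
    # (e.g. epochs) and returning its validation loss.
    return abs(config["lr"] - 0.01) + 1.0 / (config["depth"] * budget)

def adaptive_halving(n_configs=27, min_budget=1, eta=3, rounds=3):
    configs = [sample_config() for _ in range(n_configs)]
    threshold = float("inf")
    budget = min_budget
    for _ in range(rounds):
        scored = [(validation_loss(c, budget), c) for c in configs]
        # Dynamically tighten the threshold toward the best losses seen so far,
        # so subsequent rounds exploit only the most promising configurations.
        best_loss = min(loss for loss, _ in scored)
        threshold = min(threshold, best_loss * 1.5)
        survivors = [c for loss, c in scored if loss <= threshold]
        # Keep at most the top 1/eta survivors, as in successive halving,
        # and increase the budget for the next round.
        survivors = sorted(survivors, key=lambda c: validation_loss(c, budget))
        configs = survivors[: max(1, len(scored) // eta)]
        budget *= eta
    return configs[0]

if __name__ == "__main__":
    print(adaptive_halving())
```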
