Hyperparameter recommendation via meta-learning relies on the characterization and quality of meta-features. These meta-features provide critical information about the underlying datasets but are often selected manually based on the practitioner's experience and preference, which can be inefficient and ineffective in many applications. In this paper, we propose a novel hyperparameter recommendation approach that incorporates a Lasso-based multivariate kernel group (KGLasso) model. The KGLasso model automatically identifies the primary meta-features during training; by selecting the most explanatory meta-features for a specific meta-learning task, it makes the resulting recommendations substantially more effective. KGLasso builds on a group-wise generalized multivariate Lasso formulation, for which we establish a minimization algorithm based on a corresponding auxiliary function and prove that it is convergent and robust. As an application, we develop a hyperparameter recommendation system for the well-known support vector machine (SVM) algorithm by training the KGLasso model on 120 UCI datasets. The system efficiently provides competitive hyperparameter recommendations for new tasks. Extensive experiments, including comparisons with popular meta-learning baselines and search algorithms, demonstrate the superiority of the proposed approach. Our results highlight the benefit of integrating model learning and feature selection to construct an automated meta-learner for hyperparameter recommendation in meta-learning.
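To make the described pipeline concrete, the following is a minimal, hypothetical sketch of the workflow the abstract outlines: meta-features of past datasets are regressed onto their best-known SVM hyperparameters with a group-sparse penalty, so each meta-feature is kept or dropped jointly across all hyperparameter targets, and the fitted model then recommends hyperparameters for a new task. It uses scikit-learn's MultiTaskLasso as a stand-in for the paper's KGLasso model (the kernelization and the auxiliary-function solver are not reproduced), and all data, dimensions, and parameter values below are illustrative assumptions, not the authors' setup.

```python
import numpy as np
from sklearn.linear_model import MultiTaskLasso
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical meta-dataset: 120 historical datasets, 30 candidate
# meta-features, and 2 SVM hyperparameters (log C, log gamma) that
# performed best on each dataset.
X_meta = rng.normal(size=(120, 30))   # meta-features of past datasets
Y_hp = rng.normal(size=(120, 2))      # best-known (log C, log gamma)

scaler = StandardScaler().fit(X_meta)
X_std = scaler.transform(X_meta)

# Group-sparse multivariate regression: the l2/l1 penalty zeroes out a
# meta-feature's coefficients for all hyperparameters at once, so a
# meta-feature is either used everywhere or discarded entirely.
model = MultiTaskLasso(alpha=0.1).fit(X_std, Y_hp)
selected = np.flatnonzero(np.linalg.norm(model.coef_, axis=0) > 0)
print("selected meta-feature indices:", selected)

# Recommendation for a new task: compute its meta-features and predict
# a warm-start (log C, log gamma) for the SVM.
x_new = rng.normal(size=(1, 30))      # meta-features of the new dataset
log_C, log_gamma = model.predict(scaler.transform(x_new))[0]
print("recommended C, gamma:", np.exp(log_C), np.exp(log_gamma))
```

In this sketch, predicting all hyperparameters jointly is what motivates the group structure: a meta-feature that helps explain one hyperparameter is retained for the others as well, mirroring the group-wise selection the abstract attributes to KGLasso.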