Abstract
Click-through rate (CTR) prediction is an important task in commercial recommender systems, which aims to predict the probability of a user clicking on an item. The event of a user clicking on an item is accompanied by several user and item features. As modelling the feature interactions effectively can lead to better predictions, it has been the focus of many recent approaches, including deep learning-based models. However, the existing approaches either (i) model all possible feature interactions for a given order, or (ii) manually select which feature interactions to model. Moreover, they use the same network structure or function to model all feature interactions, ignoring the differences in complexity among them. To address these issues, we propose a neural architecture search-based approach called AutoFeature that automatically finds essential feature interactions and selects an appropriate structure to model each of these interactions. Specifically, we first define a flexible architecture search space for the CTR prediction task which covers many popular designs, such as PIN, PNN, and DeepFM, and enables higher-order interactions. Then we propose an efficient neural architecture search algorithm that recursively refines the search space by partitioning it into several subspaces and samples from the higher-quality ones. Extensive experiments on multiple CTR prediction benchmarks show the superiority of AutoFeature over state-of-the-art baselines. In addition, our experiments show that the learned architectures use fewer FLOPs/parameters and hence can efficiently incorporate higher-order feature interactions, which further boosts performance. Finally, we show that AutoFeature can find meaningful feature interactions.
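To make the search strategy described above concrete, the following is a minimal, self-contained Python sketch of a generic partition-and-sample loop in the same spirit: candidate architectures are sampled, each subspace's average validation quality is tracked, and the most promising subspace is refined by fixing one more design choice. All names (`Subspace`, `evaluate`, the block types) and the scoring are hypothetical illustrations, not the actual AutoFeature algorithm or its search space.

```python
# Illustrative sketch of quality-guided partition-and-sample architecture search.
# Names and the evaluation function are hypothetical, not taken from AutoFeature.
import random
from dataclasses import dataclass, field

# Each candidate assigns one block type to every feature-interaction slot.
BLOCK_TYPES = ["skip", "mlp", "inner_product", "outer_product"]  # hypothetical choices


@dataclass
class Subspace:
    """A region of the search space: some slots fixed, the rest free."""
    fixed: dict = field(default_factory=dict)    # slot index -> block type
    scores: list = field(default_factory=list)   # validation scores of samples drawn here

    def sample(self, num_slots: int) -> dict:
        """Draw a random architecture consistent with the fixed assignments."""
        arch = dict(self.fixed)
        for slot in range(num_slots):
            arch.setdefault(slot, random.choice(BLOCK_TYPES))
        return arch

    def quality(self) -> float:
        """Mean score of architectures sampled from this subspace so far."""
        return sum(self.scores) / len(self.scores) if self.scores else 0.0


def evaluate(arch: dict) -> float:
    """Placeholder for training/validating an architecture (e.g. AUC on a CTR dataset)."""
    # A fake score so the loop runs end to end; a real search would train the model here.
    return sum(0.25 if block == "skip" else random.random() for block in arch.values()) / len(arch)


def search(num_slots: int = 6, rounds: int = 5, samples_per_round: int = 8):
    spaces = [Subspace()]                        # start from the whole search space
    best_arch, best_score = None, float("-inf")

    for _ in range(rounds):
        # Allocate more samples to the higher-quality subspaces.
        spaces.sort(key=Subspace.quality, reverse=True)
        for space in spaces[: max(1, len(spaces) // 2)]:
            for _ in range(samples_per_round):
                arch = space.sample(num_slots)
                score = evaluate(arch)
                space.scores.append(score)
                if score > best_score:
                    best_arch, best_score = arch, score

        # Refine: split the best subspace by fixing one additional slot to each block type.
        best_space = max(spaces, key=Subspace.quality)
        free_slots = [s for s in range(num_slots) if s not in best_space.fixed]
        if free_slots:
            slot = free_slots[0]
            spaces = [Subspace(fixed={**best_space.fixed, slot: b}) for b in BLOCK_TYPES]

    return best_arch, best_score


if __name__ == "__main__":
    arch, score = search()
    print("best architecture:", arch, "score:", round(score, 3))
```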