Abstract

Meta-learning has proven effective in tackling the cold-start problem of recommendation systems. Most work in this line adopts a meta-optimization approach that learns global knowledge to initialize the base recommender and adapts it to unseen recommendation tasks. Typically, the process is non-adaptive in the sense that neither the initialization nor the model fitting is task-specific. Such non-adaptive approaches do not work well for the strict cold-start problem, where new users' historical interaction data is scarce or entirely unavailable. In this paper, we propose a novel adaptive meta-optimization approach, called Adaptive Meta-Optimization (AdaMO), to address the problem. AdaMO is fully adaptive: it simultaneously customizes the initialization and the model fitting according to task-specific information. During the cold-start phase, it aggregates task-specific information with knowledge transferred from all historical tasks to tune the initialization of the base model. In the subsequent warm-up phase, AdaMO mines adaptive transition information from behavior patterns and parameter status to effectively fit the model using transitional learning. By striking a careful balance between shared knowledge and task-specific information, AdaMO outperforms state-of-the-art methods in various cold-start scenarios.
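The task-adaptive initialization idea described above can be sketched roughly as follows. This is a minimal toy, not the paper's actual model: the linear base recommender, the `W_mod` conditioning map, and the function names are all illustrative assumptions, standing in for "tune the shared initialization with task-specific information, then fit with a few gradient steps."

```python
import numpy as np

def task_adaptive_init(theta0, task_embedding, W_mod):
    """Customize the shared meta-initialization for one task.

    theta0: globally meta-learned initialization (shared knowledge).
    task_embedding: summary of the task's scarce interaction data.
    W_mod: assumed-learned map from task embedding to a parameter shift.
    """
    return theta0 + W_mod @ task_embedding  # shift the init toward this task

def inner_loop_fit(theta_init, X, y, lr=0.05, steps=5):
    """A few gradient steps of the base model (here: linear regression)."""
    theta = theta_init.copy()
    for _ in range(steps):
        grad = 2 * X.T @ (X @ theta - y) / len(y)  # MSE gradient
        theta -= lr * grad
    return theta

rng = np.random.default_rng(0)
theta0 = rng.normal(size=3)             # shared meta-initialization
W_mod = 0.1 * rng.normal(size=(3, 4))   # task-conditioning map (assumed learned)
z_task = rng.normal(size=4)             # embedding of the new user's few interactions

theta_task = task_adaptive_init(theta0, z_task, W_mod)
X = rng.normal(size=(8, 3))
y = X @ np.array([1.0, -2.0, 0.5])      # synthetic task data
theta_fit = inner_loop_fit(theta_task, X, y)
```

A non-adaptive meta-learner would start every task from the same `theta0`; the point of the sketch is that the starting point itself is conditioned on the task before any fitting happens.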
