Abstract

The cold-start problem has been recognized as one of the most crucial challenges in recommender systems. Many recommendation algorithms work well when ample preference information is available but degrade in cold-start settings. Inspired by meta-learning, we observe that the demands of cold-start problems align with the strengths of meta-learning: meta-learning aims to learn a model from a small set of labeled examples (users' consumption histories) that can then be quickly generalized to new tasks (recommendations for new users or new items). Facing the extreme cold-start scenario, we therefore propose a meta-learning embedding ensemble (ML2E) recommendation algorithm to forecast new users' preferences and generate desirable initial embeddings for new items. Notably, the training process of ML2E uses only first-order gradient information, and ML2E has both the incremental mining ability across different mini-batches within the same task and the generalization ability across different tasks. Finally, we validated ML2E on two benchmark datasets; experimental results show that our algorithm significantly improves recommendation metrics in comparison with three existing baselines.
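The abstract describes training with only first-order gradient information so that a learned meta-initialization adapts quickly to new users or items. Since the paper's full text is not available here, the following is only a minimal sketch of that general idea in the style of Reptile-style first-order meta-learning, not the authors' actual ML2E algorithm; the embedding dimension, inner loss (squared error on a linear scorer), and all variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def inner_sgd(phi, X, y, lr=0.1, steps=5):
    """Adapt the meta-initialized embedding phi to one task
    (e.g. one user's small consumption history) with a few SGD
    steps on a squared loss; uses first-order gradients only."""
    w = phi.copy()
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

# Meta-initialization for new-user/new-item embeddings
# (dimension 8 is an arbitrary illustrative choice).
phi = np.zeros(8)
meta_lr = 0.5

for _ in range(200):                      # meta-iterations over sampled tasks
    X = rng.normal(size=(16, 8))          # one task: 16 observed interactions
    w_true = rng.normal(size=8)           # hypothetical task-specific preferences
    y = X @ w_true                        # synthetic ratings for the sketch
    w_task = inner_sgd(phi, X, y)         # fast adaptation to the task
    phi += meta_lr * (w_task - phi)       # first-order meta-update toward
                                          # the adapted solution (Reptile-style)
```

In a cold-start setting, `phi` would then serve as the initial embedding for a brand-new user or item, so that only a handful of interactions suffice to specialize it.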
