Abstract

With the ever-growing amount of online information, recommender systems (RSs) act as information filtering tools and are widely used in various e-commerce platforms. Recommender methods generally adopt only one type of behavior data for single-task learning. Multitask learning is commonly used to address multiple tasks simultaneously, utilizing as much information as possible. We propose a knowledge distillation-enhanced shared-bottom model for recommender multitask learning. The postview click-through rate, postview conversion rate, and average transaction value are learned together to determine the final gross merchandise volume (GMV). First, a mixture of experts with gate networks is used as a shared bottom to learn task-specific representations for all tasks, while a tree distillation framework is designed as an expert for better feature selection. Then, we design a neural factorization machine as a task-specific prediction network to estimate each individual goal. Finally, an objective function is designed over the task outputs to optimize the gross merchandise volume. Our proposed framework can enhance recommendation quality while avoiding a large growth in the number of parameters. The model serves e-commerce: it builds personalized GMV predictions for small scenarios on e-commerce platforms at low cost, a zero-to-one process that is often highly rewarding. Ultimately, we aim to improve the reliability of such e-commerce scenarios.
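The architecture described above can be sketched in miniature. The code below is an illustrative mock-up, not the authors' implementation: a mixture of experts with per-task gate networks (MMoE-style) produces task-specific representations, three scalar heads stand in for the CTR, CVR, and transaction-value predictors, and the final GMV estimate combines them multiplicatively. All dimensions, weight initializations, and the sigmoid heads are assumptions for illustration; the tree-distillation expert and neural factorization machine heads are replaced here by plain linear layers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not from the paper):
# input dim, expert hidden dim, number of experts, number of tasks (CTR, CVR, ATV).
D, H, E, T = 16, 8, 4, 3

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Shared bottom: E experts; each task owns a gate over the experts.
W_experts = rng.normal(size=(E, D, H))  # one linear expert per slot
W_gates = rng.normal(size=(T, D, E))    # one gate network per task
W_heads = rng.normal(size=(T, H))       # one scalar prediction head per task

def forward(x):
    # x: (batch, D) feature matrix.
    expert_out = np.stack([x @ W_experts[e] for e in range(E)], axis=1)  # (batch, E, H)
    preds = []
    for t in range(T):
        gate = softmax(x @ W_gates[t])                   # (batch, E) mixture weights
        rep = np.einsum("be,beh->bh", gate, expert_out)  # task-specific representation
        preds.append(sigmoid(rep @ W_heads[t]))          # (batch,) scalar prediction
    return preds  # [pCTR, pCVR, normalized ATV]

x = rng.normal(size=(2, D))
p_ctr, p_cvr, atv = forward(x)
# Per-impression GMV estimate: click prob x conversion prob x transaction value.
gmv = p_ctr * p_cvr * atv
```

The key design point, consistent with the abstract, is that the experts are shared while the gates and heads are task-specific, so each task draws its own weighted mixture of the shared representations before the final GMV objective ties the three outputs together.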

