Abstract

The self-attention mechanism is primarily designed to capture the correlation (interaction) between any two objects in a sequence. Inspired by self-attention's success in many NLP tasks, researchers have employed self-attention in sequential recommendation to refine user representations by capturing the correlations among a user's historically interacted items. However, the user representations in previous self-attention-based models are not flexible enough, since self-attention is applied only on the user side, which restricts performance improvement. In this paper, we propose a deep recommendation model with feature-level self-attention, namely SAFrec, which exhibits enhanced recommendation performance mainly due to two advantages. First, SAFrec employs the self-attention mechanism on the user side and the item side simultaneously, to co-refine user representations and item representations. Second, SAFrec leverages item features distilled from open knowledge graphs or websites to represent users and items at a fine-grained (feature) level, so the correlations between users and items are discovered more thoroughly. Extensive experiments conducted on two real datasets (NetEase Music and Book-Crossing) not only demonstrate SAFrec's superiority in top-n recommendation over state-of-the-art deep recommendation models, but also validate the significance of incorporating the self-attention mechanism and feature-level representations.
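The core operation the abstract refers to — scaled dot-product self-attention applied symmetrically to user-side and item-side feature sequences — can be sketched as follows. This is a minimal illustration only: the abstract does not specify SAFrec's learned projection matrices, layer counts, or scoring function, so the unprojected attention below and the names `user_feats`/`item_feats` are assumptions for demonstration.

```python
import numpy as np

def self_attention(X):
    """Scaled dot-product self-attention over a feature sequence.

    X: (n, d) array, one row per feature embedding.
    Returns attention-refined representations of the same shape.
    Note: learned query/key/value projections (used in real models,
    presumably including SAFrec) are omitted for brevity.
    """
    d = X.shape[1]
    scores = X @ X.T / np.sqrt(d)  # pairwise feature correlations
    # Numerically stable row-wise softmax over the scores.
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ X

# Applied on both sides simultaneously, as the abstract describes:
rng = np.random.default_rng(0)
user_feats = rng.normal(size=(5, 8))  # 5 feature embeddings from a user's history (hypothetical)
item_feats = rng.normal(size=(3, 8))  # 3 feature embeddings of a candidate item (hypothetical)
refined_user = self_attention(user_feats)
refined_item = self_attention(item_feats)
print(refined_user.shape, refined_item.shape)  # (5, 8) (3, 8)
```

Each refined feature vector is a convex combination of all feature vectors on its side, weighted by their mutual correlations, which is what lets the model co-refine user and item representations at the feature level.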
