Abstract

Recently, deep learning has come to dominate recommender systems, as it can effectively capture nonlinear and nontrivial user–item relationships and perform complex nonlinear transformations. However, existing methods still have several issues. First, they typically treat user–item interactions independently and may fail to capture the more complex, hidden information that is inherently implicit in the local neighborhood surrounding an interaction sample. Second, quantifying the dependence degree of user–item sequences shows that short-term and long-term dependent behavioral patterns co-exist. Unfortunately, typical deep learning methods can be problematic when coping with very long-term sequential dependencies. To address these issues, we propose a novel unified neural collaborative recommendation algorithm that capitalizes on memory networks to learn attention embeddings from implicit interactions (NCRAE). In particular, the attention mechanism learns the relative importance of different users and items from user–item interaction sequences, which provides a better way to focus on relevant inputs and helps to memorize long-term sequential dependencies. Extensive experiments on three real-world datasets show significant improvements of the proposed NCRAE over competitive methods. Empirical evidence shows that using memory networks to learn attention embeddings of users' implicit interactions yields better recommendation performance.
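The abstract does not specify NCRAE's architecture, but the core idea of weighting a user's interacted items by learned attention scores can be sketched as follows. This is a minimal illustration, not the paper's method: the dot-product scoring, the use of the user embedding as the attention query, and all dimensions are assumptions for exposition.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a score vector
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_embedding(item_embeddings, user_embedding):
    """Summarize a user's interaction history with attention weights.

    item_embeddings: (n_items, d) array, one row per interacted item
    user_embedding:  (d,) array used here as the attention query
    Returns a (d,) attention-weighted summary embedding, so items judged
    more relevant to the user contribute more to the representation.
    """
    scores = item_embeddings @ user_embedding   # relevance score per item
    weights = softmax(scores)                   # relative importance, sums to 1
    return weights @ item_embeddings            # weighted sum of item vectors

rng = np.random.default_rng(0)
items = rng.normal(size=(5, 8))  # 5 interacted items, 8-dim embeddings
user = rng.normal(size=8)
summary = attention_embedding(items, user)
print(summary.shape)             # summary lives in the same embedding space
```

In a full model, the scoring function and embeddings would be learned end-to-end, and a memory network would let the attention reach arbitrarily far back in the interaction sequence rather than over a fixed window.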
