Abstract

As a significant application of Big Data, recommender systems effectively mitigate information overload. Users' behavior sequences form massive data with great mining value. Sequential recommendation extracts user features from this sequential data and predicts the next interaction based on the user's recent temporal behavior. Currently, recurrent neural networks (RNNs) and graph neural networks (GNNs) serve as the item-embedding backbone in sequential recommendation and have shown adequate performance. However, such RNN-based and GNN-based models cannot deeply mine complex behavior sequences and neglect explicit user preferences such as rating information. Inspired by the Transformer, we adopt the Transformer encoder layer to process sequences and represent item embeddings with multi-head attention. Meanwhile, rating information is integrated into the weight calculation when we represent user preference with self-attention. Weighting with ratings not only retains the structural information of the sequence but also incorporates the user's preferences. Furthermore, we combine global and local preferences into a hybrid representation and make Top-N recommendations. We conduct experiments on large real-world datasets, and our model outperforms state-of-the-art methods in most cases on two datasets.

Keywords: Recommender system, Sequential recommendation, Transformer, Attention mechanism, Big data application, Data analytics
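
To make the rating-integrated attention concrete, here is a minimal PyTorch sketch of a single-head variant. The abstract only states that ratings enter the weight calculation; this sketch assumes normalized ratings additively bias the attention logits before the softmax, so highly rated items receive more attention mass. The class name, the additive-bias design, and the rating normalization are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RatingWeightedSelfAttention(nn.Module):
    """Hypothetical single-head self-attention modulated by ratings.

    Assumption: ratings are pre-normalized to [0, 1] and added to the
    attention logits of the corresponding key positions.
    """

    def __init__(self, d_model: int):
        super().__init__()
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.scale = d_model ** -0.5

    def forward(self, x: torch.Tensor, ratings: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model); ratings: (batch, seq_len) in [0, 1]
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        # Scaled dot-product logits between every pair of positions.
        logits = torch.matmul(q, k.transpose(-2, -1)) * self.scale
        # Bias each key position by the user's rating of that item,
        # shifting attention toward highly rated interactions.
        logits = logits + ratings.unsqueeze(1)
        weights = F.softmax(logits, dim=-1)
        return torch.matmul(weights, v)
```

An additive bias keeps the softmax normalization intact; multiplying the post-softmax weights by ratings would be an alternative design but would require re-normalization to remain a proper attention distribution.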
