Abstract

Session-based recommendation, which aims to match user needs with rich resources based on anonymous sessions, now plays a critical role in various online platforms (e.g., media streaming sites, search, and e-commerce). Existing recommendation algorithms usually model a session as a sequence or as a session graph to capture transitions between items. Despite their effectiveness, we argue that the performance of these methods is still limited: (1) they use only a fixed session item embedding, without considering the diversity of users' interests with respect to target items; and (2) for users' long-term interests, they have difficulty accurately capturing the different priorities of different items. To address these shortcomings, we propose a novel model that leverages both a target attentive network and a self-attention network to improve a graph-neural-network (GNN)-based recommender. In our model, we first represent users' interaction sequences as session graphs, which serve as the input of the GNN, and the vector of each node in a session graph is obtained via the GNN. Next, the target attentive network activates different user interests corresponding to different target items (i.e., the learned session embedding varies with the target item), which reveals the relevance between users' interests and target items. Finally, the self-attention mechanism captures the different priorities of different items, improving the precision of the long-term session representation. By using a hybrid of long-term and short-term session representations, we capture users' comprehensive interests at multiple levels. Extensive experiments on two real-world datasets demonstrate the effectiveness of our algorithm for session-based recommendation.
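
The following is a minimal PyTorch-style sketch of the pipeline described above. The class name HybridSessionRecommender, the single-step graph propagation used in place of a gated GNN, the position-level adjacency, and all dimensions are illustrative assumptions for exposition, not the authors' exact architecture.

# Minimal sketch (assumptions noted above): GNN node vectors, target attention,
# self-attention for long-term interest, and a hybrid long/short-term session
# embedding used to score candidate items.
import torch
import torch.nn as nn

class HybridSessionRecommender(nn.Module):
    def __init__(self, n_items, d=64, n_heads=2):
        super().__init__()
        self.item_emb = nn.Embedding(n_items, d)
        # Simplified graph propagation: one linear step over the session
        # adjacency matrix (a stand-in for a gated GNN update).
        self.gnn_linear = nn.Linear(d, d)
        self.w_target = nn.Linear(d, d, bias=False)        # target attention
        self.self_attn = nn.MultiheadAttention(d, n_heads, batch_first=True)
        self.ffn = nn.Sequential(nn.Linear(d, d), nn.ReLU(), nn.Linear(d, d))
        self.fuse = nn.Linear(3 * d, d)

    def forward(self, session_items, adj):
        # session_items: (B, L) item ids; adj: (B, L, L) session-graph adjacency
        # over the padded item positions (a simplification).
        x = self.item_emb(session_items)                      # (B, L, d)
        h = torch.relu(self.gnn_linear(torch.bmm(adj, x)) + x)  # node vectors

        # Long-term interest: self-attention block + point-wise feed-forward.
        attn_out, _ = self.self_attn(h, h, h)
        long_term = self.ffn(attn_out + h).mean(dim=1)        # (B, d)

        # Short-term interest: the last clicked item.
        short_term = h[:, -1, :]                              # (B, d)

        # Target attention: the session embedding varies with each candidate.
        candidates = self.item_emb.weight                     # (N, d)
        scores = torch.einsum('bld,nd->bln', self.w_target(h), candidates)
        alpha = torch.softmax(scores, dim=1)                  # (B, L, N)
        target_aware = torch.einsum('bln,bld->bnd', alpha, h)  # (B, N, d)

        # Hybrid session representation, then dot-product scoring over items.
        base = torch.cat([long_term, short_term], dim=-1)     # (B, 2d)
        session = self.fuse(torch.cat(
            [base.unsqueeze(1).expand(-1, candidates.size(0), -1),
             target_aware], dim=-1))                          # (B, N, d)
        return (session * candidates.unsqueeze(0)).sum(-1)    # (B, N) logits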

Highlights

  • In recommendation scenarios such as Internet e-commerce and streaming media, users interact with related items in chronological order

  • With the recent revival of neural networks, session-based recommendation, which matches user needs with rich resources based on anonymous sessions, has attracted attention from both industry and academia as an effective way to satisfy diverse service demands and alleviate information overload

  • To overcome these limitations, we argue that incorporating user interests specific to a target item and accurately capturing the different priorities of different items can effectively improve the user's session representation

Summary

Introduction

In recommendation scenarios such as Internet e-commerce and streaming media, users interact with related items in chronological order. In the GNN-based approach, session sequences are first constructed into session graphs, the node vectors in each session graph are obtained via a graph neural network, and the resulting session representation integrates users' long-term and short-term interest preferences. Although this approach expresses the complex transitions between items, it still ignores user interests specific to a target item and does not accurately capture users' long-term preferences. Yu et al. [8] proposed a novel target attentive graph neural network for session-based recommendation, but it is unable to accurately capture users' long-term preferences. To improve the representation of session sequences, we propose a novel model that combines a self-attention network with a target attentive network, capturing both user interests specific to a target item and accurate priorities for different items. To model session sequences, we combine short-term and long-term user preferences on top of an attention-enhanced GNN-based recommender; a sketch of how a session graph can be built from a click sequence is given below.
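
As a complement, the sketch below (a hypothetical helper, not taken from the paper) illustrates one common way to turn a clicked-item sequence into a session graph: unique items become nodes, consecutive clicks become directed edges, and edge weights are row-normalized.

# Hypothetical helper: build a row-normalized session-graph adjacency
# from a click sequence.
import torch

def build_session_graph(item_ids):
    """item_ids: list of item indices in click order, e.g. [5, 2, 5, 7]."""
    nodes = list(dict.fromkeys(item_ids))          # unique items, order kept
    index = {item: i for i, item in enumerate(nodes)}
    adj = torch.zeros(len(nodes), len(nodes))
    for a, b in zip(item_ids, item_ids[1:]):       # consecutive transitions
        adj[index[a], index[b]] += 1.0
    row_sum = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
    return nodes, adj / row_sum                    # row-normalized adjacency

nodes, adj = build_session_graph([5, 2, 5, 7])
# nodes == [5, 2, 7]; adj encodes 5->2, 2->5, 5->7 with normalized weights.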

Classical Recommendation Methods
Deep-Neural-Network-Based Recommendation Methods
Preliminaries
Construction for Session Graphs and Session Matrix
Learning Node Vectors on Session Graphs
Self-Attention Layers Construction
Self-Attention Layer
Point-Wise Feed-Forward Network
Multi-layer Self-Attention
Hybrid Session Embeddings Construction
Prediction Layer
Experiments
Datasets
Baseline Algorithms
Parameter Setting
Evaluation Metrics
Observations about Our Model
Other Observations
Impact of the Number of Self-attention Blocks
Impact of Varying Session Representations
Conclusions and Future Work