Abstract

Session-based recommendation (SBR) predicts the next item an anonymous user will click. Recent studies model sessions with graph attention networks, but these algorithms suffer from several problems. First, most of them consider only users' long-term interests and ignore short-term interest shifts. Second, when the graph attention mechanism computes item weights, irrelevant items are also assigned weight, which dilutes the weights of relevant items and leaves the weight distribution insufficiently discriminative. Third, the last action is taken as the user's final preference, yet this assumption does not necessarily reflect the user's actual interest. This paper proposes a long- and short-term interest-attention aware network (LSIAN) model. First, we propose a novel time-aware attention mechanism that learns users' short-term interests by considering context and time intervals, and uses average pooling to represent users' long-term interests. Next, we introduce a context-based adaptive sparse attention mechanism that identifies irrelevant items, down-weights them through a dynamic loop, and reserves higher weights for relevant items. Furthermore, we introduce a gated fusion method to adaptively integrate users' long-term and short-term preferences, alleviating the problem of over-weighting the last item. Experiments on two public datasets show that LSIAN outperforms state-of-the-art algorithms.
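As a rough illustration of the gated fusion step, the sketch below assumes a sigmoid gate computed from the concatenated long- and short-term preference vectors; the abstract does not specify LSIAN's exact parameterization, so the weight matrix `W`, bias `b`, and per-dimension gating are assumptions for illustration only.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(long_pref, short_pref, W, b):
    """Adaptively fuse long- and short-term preference vectors.

    A gate g in (0, 1), computed from both vectors, decides per
    dimension how much of the long-term preference to keep; the
    remainder (1 - g) comes from the short-term preference.
    """
    z = np.concatenate([long_pref, short_pref])  # joint context
    g = sigmoid(W @ z + b)                       # gate values in (0, 1)
    return g * long_pref + (1.0 - g) * short_pref

# Toy example with random (hypothetical) parameters.
rng = np.random.default_rng(0)
d = 4
long_pref = rng.normal(size=d)
short_pref = rng.normal(size=d)
W = rng.normal(size=(d, 2 * d))
b = np.zeros(d)

fused = gated_fusion(long_pref, short_pref, W, b)
```

Because the gate lies strictly in (0, 1), each dimension of the fused vector is a convex combination of the corresponding long- and short-term values, so neither signal can be discarded entirely.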
