DOI: https://doi.org/10.1016/j.asoc.2022.109894
Journal: Applied Soft Computing | Publication Date: Nov 30, 2022 | Citations: 25
Sequential recommendation has been an active research topic in recent years. Various sequential recommendation models have been proposed, among which Self-Attention (SA)-based models achieve state-of-the-art performance. However, most existing SA-based sequential recommendation models do not make use of temporal information, i.e., the timestamps of user–item interactions, apart from an initial attempt (Li et al., 2020). In this paper, we propose a Time-Aware Transformer for Sequential Recommendation (TAT4SRec), an SA-based neural network model that utilizes temporal information to capture users' preferences more precisely. TAT4SRec has two salient features: (1) TAT4SRec employs an encoder–decoder structure to model timestamps and interacted items separately, which appears to be a more effective way of exploiting temporal information. (2) In the proposed TAT4SRec, two different embedding modules transform continuous data (timestamps) and discrete data (item IDs) into embedding matrices, respectively. Specifically, we propose a window function-based embedding module that preserves the continuous dependency between similar timestamps. Finally, extensive experiments demonstrate the effectiveness of the proposed TAT4SRec over various state-of-the-art MC/RNN/SA-based sequential recommendation models under several widely used metrics. Furthermore, additional experiments illustrate the rationality of the proposed structures and demonstrate the computational efficiency of TAT4SRec. These promising experimental results make it feasible to apply TAT4SRec in various online applications.
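The abstract does not specify the exact form of the window function-based embedding module, but the core idea it describes (mapping continuous timestamps to embeddings so that similar timestamps yield similar vectors) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the triangular window shape, the number of windows, and the random projection standing in for learned weights are all assumptions.

```python
import numpy as np

def window_embedding(timestamps, num_windows=8, d_model=16,
                     t_min=0.0, t_max=1.0, seed=0):
    """Hypothetical sketch of a window function-based timestamp embedding.

    Each timestamp activates a set of overlapping triangular windows
    spread over [t_min, t_max]; nearby timestamps produce similar
    activation patterns, preserving continuous dependency. The
    activations are then projected to the model dimension (here with a
    fixed random matrix as a stand-in for learned parameters).
    """
    rng = np.random.default_rng(seed)
    centers = np.linspace(t_min, t_max, num_windows)   # window centers
    width = (t_max - t_min) / (num_windows - 1)        # window half-width

    t = np.asarray(timestamps, dtype=float)[:, None]   # shape (n, 1)
    # Triangular window activations, shape (n, num_windows):
    # 1 at the center, decaying linearly to 0 one width away.
    act = np.clip(1.0 - np.abs(t - centers) / width, 0.0, None)

    # Project activations into the embedding space.
    W = rng.standard_normal((num_windows, d_model))
    return act @ W
```

Because the window activations vary smoothly with the timestamp, two interactions occurring close together in time receive nearly identical embeddings, whereas a one-hot bucketing of timestamps would discard that proximity.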