Abstract

Recurrent neural network (RNN)-based recommendation algorithms have been introduced recently, as sequence information plays an increasingly important role in modeling user preferences. However, these methods have notable limitations: they usually give undue weight to sequential changes while placing insufficient emphasis on the correlation between adjacent items, and they typically ignore the impact of context information. To address these issues, we propose an attention-based context-aware sequential recommendation model built on the Gated Recurrent Unit (GRU), abbreviated ACA-GRU. First, we consider the impact of context information on recommendation and classify it into four categories: input context, correlation context, static interest context, and transition context. Then, by redefining the update and reset gates of the GRU unit, we compute the global sequential state transition of the RNN as determined by these contexts, modeling the dynamics of user interest. Finally, by applying an attention mechanism to the correlation context, the model distinguishes the importance of each item in the rating sequence, so the impact of outliers that are less informative or less predictive is reduced or ignored. Experimental results show that ACA-GRU outperforms state-of-the-art context-aware models as well as sequential recommendation algorithms, demonstrating the effectiveness of the proposed model.
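The abstract's two core ideas can be illustrated with a minimal sketch: a GRU step whose update and reset gates take an extra context term, followed by soft attention over the hidden states so that less predictive items receive near-zero weight. All weight names, the exact gate formulation, and the attention query below are assumptions for illustration, not the paper's actual equations.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def aca_gru_step(x_t, h_prev, c_t, params):
    # Hypothetical context-modulated GRU step: the context embedding c_t
    # (e.g. input or transition context) enters the update and reset gates
    # as an additional linear term. This is a sketch, not the paper's model.
    Wz, Uz, Cz, Wr, Ur, Cr, Wh, Uh = params
    z = sigmoid(Wz @ x_t + Uz @ h_prev + Cz @ c_t)   # update gate with context
    r = sigmoid(Wr @ x_t + Ur @ h_prev + Cr @ c_t)   # reset gate with context
    h_tilde = np.tanh(Wh @ x_t + Uh @ (r * h_prev))  # candidate state
    return (1 - z) * h_prev + z * h_tilde

def attention_pool(hs, q):
    # Soft attention over the sequence of hidden states: items whose
    # states score low against the query q get weights near zero,
    # reducing the influence of outlier interactions.
    scores = hs @ q
    w = np.exp(scores - scores.max())
    w /= w.sum()
    return w @ hs

rng = np.random.default_rng(0)
d_x, d_h, d_c, T = 4, 8, 3, 5  # toy dimensions (placeholders)
params = (rng.normal(size=(d_h, d_x)), rng.normal(size=(d_h, d_h)),
          rng.normal(size=(d_h, d_c)), rng.normal(size=(d_h, d_x)),
          rng.normal(size=(d_h, d_h)), rng.normal(size=(d_h, d_c)),
          rng.normal(size=(d_h, d_x)), rng.normal(size=(d_h, d_h)))

h = np.zeros(d_h)
states = []
for t in range(T):
    x_t = rng.normal(size=d_x)   # item embedding (placeholder)
    c_t = rng.normal(size=d_c)   # context embedding (placeholder)
    h = aca_gru_step(x_t, h, c_t, params)
    states.append(h)

user_repr = attention_pool(np.stack(states), q=rng.normal(size=d_h))
print(user_repr.shape)  # attention-pooled user representation, shape (8,)
```

In this sketch the attention weights form a softmax over item states, so an uninformative item simply contributes little to the pooled user representation, which is the outlier-dampening behavior the abstract describes.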
