Abstract

Recent recommender systems have achieved strong results by applying Graph Neural Networks (GNNs) to user-item interaction graphs. However, these systems handle only structured interaction data and cope poorly with unstructured review text. Combining review text with the user-item interaction graph can effectively alleviate data sparsity and improve recommendation quality. Most current methods that incorporate review text simply concatenate data from different modalities, leading to insufficient cross-modal interaction and degraded recommendation performance. To address these problems and produce better item recommendations, we propose RTN-GNNR, a model that fuses Review Text features and Node features for Graph Neural Network Recommendation. RTN-GNNR consists of four modules. The review-text feature extraction module combines Bidirectional Encoder Representations from Transformers (BERT), a Bi-directional Gated Recurrent Unit (Bi-GRU), and an attention mechanism, enabling the model to focus on the most valuable reviews. The node feature extraction module applies a GNN with an attention mechanism to the interaction nodes, giving the model stronger higher-order feature extraction capability. The feature fusion module places a Factorization Machine (FM) and a Multilayer Perceptron (MLP) in tandem to realize interactive learning among multi-source features. The prediction module takes the inner product of the fused higher-order features to produce recommendations. Experiments on five publicly available Amazon datasets show that RTN-GNNR outperforms state-of-the-art personalized recommendation methods in both RMSE and MSE, especially on the two sparser datasets. Ablation experiments further demonstrate the effectiveness of each module.
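The fusion and prediction steps described above can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the feature dimensions, the function names (`fm_layer`, `fuse`, `predict`), and the exact tandem wiring (second-order FM interactions fed into an MLP over the concatenated review-text and node features, followed by an inner product) are assumptions made for illustration only; the BERT/Bi-GRU and GNN encoders that would produce `text_feat` and `node_feat` are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def fm_layer(x, V):
    """Second-order FM interactions per factor dimension:
    0.5 * ((x V)^2 - (x^2)(V^2)), where V holds the factor vectors."""
    lin = x @ V              # (batch, k)
    sq = (x ** 2) @ (V ** 2)  # (batch, k)
    return 0.5 * (lin ** 2 - sq)

def mlp(h, layers):
    """Simple MLP: ReLU on hidden layers, linear output layer."""
    for W, b in layers[:-1]:
        h = np.maximum(h @ W + b, 0.0)
    W, b = layers[-1]
    return h @ W + b

def fuse(text_feat, node_feat, V, layers):
    """Tandem FM -> MLP over the concatenated modalities (assumed wiring)."""
    x = np.concatenate([text_feat, node_feat], axis=-1)
    return mlp(fm_layer(x, V), layers)

def predict(user_vec, item_vec):
    """Prediction module: inner product of fused higher-order features."""
    return np.sum(user_vec * item_vec, axis=-1)
```

A user and an item would each be fused from their respective review-text and node features, and `predict` would yield the rating score used for recommendation.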
