Abstract

A knowledge graph (KG) contains a large amount of well-structured external triple information that can effectively alleviate the poor interpretability of collaborative filtering. Recently, recommendation system (RS) models relying on graph neural networks (GNNs) have been widely developed, but stacking more GNN layers inevitably leads to over-smoothing. Meanwhile, most current KG-based negative sampling strategies randomly collect negative samples from unobserved data to train RS models. However, these strategies are insufficient to generate negative samples that reflect genuine user demands. To overcome these obstacles, we design a model called Knowledge Graph Residual Negative Sampling Recommendation (KGRNS), which utilizes residual connections and a pooling operation to alleviate the over-smoothing problem and generates high-quality negative samples through a dedicated negative sampling strategy. Specifically, we add residual connections to each output layer of the GNN and then apply a sum pooling operation to mitigate the effect of over-smoothing on the model. In addition, to generate high-quality negative samples, we design a gated strategy that mixes the knowledge of positive and negative samples to produce synthetic negative samples, and then select the synthetic negative sample closest to the positive one through a theoretically grounded hard negative selection strategy. We conducted extensive experiments on three datasets. The experimental results showed that KGRNS achieved considerable improvements over state-of-the-art methods, and ablation studies validated the effectiveness of each component of KGRNS.
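The two mechanisms the abstract describes can be sketched in a few lines. The sketch below is illustrative only: the function names, the element-wise gate, and the inner-product scorer used to pick the hardest synthetic negative are assumptions for exposition, not the paper's actual implementation.

```python
def aggregate_layers(layer_outputs):
    # Residual-style aggregation: sum-pool the per-layer GNN embeddings so
    # shallow-layer information still contributes to the final representation,
    # mitigating over-smoothing (illustrative; not the paper's exact formula).
    dim = len(layer_outputs[0])
    return [sum(layer[i] for layer in layer_outputs) for i in range(dim)]

def mix_negative(pos, neg, gate):
    # Gated mixing: build a synthetic negative by blending positive and
    # negative embeddings element-wise (gate values assumed in [0, 1]).
    return [g * p + (1 - g) * n for g, p, n in zip(gate, pos, neg)]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def hardest_negative(user, pos, candidates, gates):
    # Form one synthetic negative per candidate, then select the one that
    # scores highest against the user embedding, i.e. the "hardest" negative
    # closest to the positive under an assumed inner-product model.
    synthetic = [mix_negative(pos, n, g) for n, g in zip(candidates, gates)]
    return max(synthetic, key=lambda v: dot(user, v))
```

For example, with a user/positive embedding `[1.0, 0.0]` and candidate negatives `[0.0, 1.0]` and `[0.9, 0.1]` under a uniform 0.5 gate, `hardest_negative` returns the blend of the second candidate, since it lies closer to the positive direction.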
