Abstract

Multi-relational graph neural networks (GNNs) are widely used for knowledge representation learning and knowledge graph (KG) reasoning. However, existing multi-relational GNNs remain limited in modeling the exchange of information between predicates. To address this limitation, we introduce Relgraph, a novel KG reasoning framework that uses relation graphs to explicitly model the interactions between different relations, enabling more comprehensive and accurate representation learning and reasoning on KGs. We further design an attention-based machine learning algorithm that jointly optimizes the original graph and its corresponding relation graph. Experimental results on large-scale KG benchmarks demonstrate that Relgraph improves KG reasoning performance. The framework is also versatile and can be seamlessly integrated with various traditional translation-based models.
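
The abstract does not spell out how a relation graph is constructed. As a rough illustration only, the sketch below (in Python, with a hypothetical `build_relation_graph` helper) builds one plausible relation graph by connecting two relations whenever they co-occur on a shared entity; this co-occurrence criterion is an assumption for illustration, not the paper's definition.

```python
from collections import defaultdict
from itertools import combinations

def build_relation_graph(triples):
    """Hypothetical construction: nodes are relations, and two relations
    are linked when they touch a common entity. The co-occurrence
    criterion is assumed; the paper may define interactions differently."""
    # Map each entity to the set of relations incident on it.
    entity_relations = defaultdict(set)
    for head, relation, tail in triples:
        entity_relations[head].add(relation)
        entity_relations[tail].add(relation)

    # Relations sharing an entity become adjacent nodes in the relation graph.
    edges = set()
    for relations in entity_relations.values():
        for r1, r2 in combinations(sorted(relations), 2):
            edges.add((r1, r2))
    return edges

# Toy KG: "works_at" and "lives_in" share the entity "alice", so they
# become adjacent in the relation graph.
triples = [
    ("alice", "works_at", "acme"),
    ("acme", "located_in", "paris"),
    ("alice", "lives_in", "paris"),
]
print(sorted(build_relation_graph(triples)))
# [('lives_in', 'located_in'), ('lives_in', 'works_at'), ('located_in', 'works_at')]
```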
