Abstract

Knowledge Graph Completion (KGC), the task of predicting missing links from the known triples of a knowledge graph, has been an active research topic in recent years. Recent work has shown that graph neural networks (GNNs) that exploit graph structure can perform well on KGC. These models gather information from the entities and relations in the subject entity's neighborhood and update its representation through a message-passing mechanism. However, existing GNN models rarely model relational information explicitly: they tend to represent and learn nodes through complex networks while ignoring the underlying semantic connections between relations. In this work, we propose a global relationship-assisted graph attention network (GRA-GAT). It not only models entities but also builds a directed graph over relations and updates each relation's representation from the relations connected to it. Specifically, strongly correlated neighboring relations are identified for aggregation by an attention function defined over the information and spatial domains. We also activate the attention values with a learnable nonlinear function, allowing the model to aggregate information adaptively. Experiments show that GRA-GAT achieves highly competitive performance on link prediction tasks.
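To illustrate the general idea of attention-weighted aggregation over neighboring relations with a learnable nonlinear activation, the following Python (PyTorch) sketch is provided. It is a minimal, hypothetical example: the module name, scoring function, and the single-parameter softplus activation are assumptions for illustration, not the paper's actual implementation.

```python
# Minimal sketch: attention over neighboring relation embeddings with a
# learnable nonlinear activation. Names and shapes are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RelationAttentionAggregator(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        # Scores each (target relation, neighboring relation) pair.
        self.score = nn.Linear(2 * dim, 1)
        # Learnable slope of the nonlinear activation applied to attention values.
        self.alpha = nn.Parameter(torch.ones(1))

    def forward(self, rel: torch.Tensor, neighbors: torch.Tensor) -> torch.Tensor:
        # rel: (dim,) embedding of the target relation
        # neighbors: (k, dim) embeddings of its neighboring relations
        pairs = torch.cat([rel.expand_as(neighbors), neighbors], dim=-1)  # (k, 2*dim)
        raw = self.score(pairs).squeeze(-1)                               # (k,)
        gated = F.softplus(self.alpha * raw)   # learnable nonlinear activation
        attn = gated / (gated.sum() + 1e-8)    # normalize to aggregation weights
        return attn @ neighbors                # weighted sum of neighbor relations

# Usage: aggregate 5 neighboring relation embeddings of dimension 16.
agg = RelationAttentionAggregator(dim=16)
out = agg(torch.randn(16), torch.randn(5, 16))
print(out.shape)  # torch.Size([16])
```

Because the activation's slope is a trainable parameter, the sharpness of the aggregation can adapt during training rather than being fixed in advance, which is the adaptive behavior the abstract refers to.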
