Abstract

Knowledge graph completion (KGC) aims to infer missing links between entities from the observed ones. Current KGC methods focus primarily on KG embedding models, which project entities and relations into low-dimensional vectors. Recently, combining textual information with graph neural network models has drawn extensive attention, owing to the superiority of GNNs in exploiting topological structure through the message passing mechanism and the effectiveness of text in supplementing structural information. Nevertheless, previous methods suffer from two limitations. First, they treat textual information as an independent instance that enhances only the corresponding entity, without considering the global semantics within the KG. Second, graph neural networks (GNNs) typically assume that the neighbors of a node are independent of each other, ignoring possible interactions between them. To address these limitations, we propose a KGC method called GS-InGAT (Interaction Graph ATtention Network with Global Semantic). Concretely, we construct a semantic graph to model semantic relationships and derive global semantic representations for entities from it. Furthermore, we introduce an efficient Interaction Graph ATtention network (InGAT) that simultaneously captures both the interaction and local information of entities, which are fused to generate structural representations. Finally, we feed the combination of the semantic and structural representations, together with relation representations, into the decoder to score triples. Experimental results demonstrate that GS-InGAT consistently achieves competitive performance on benchmark datasets, verifying the effectiveness of modeling global semantics and interactions between neighbors.
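The pipeline sketched in the abstract (attention-weighted neighbor aggregation, fusion of structural and semantic entity views, and a decoder that scores triples) can be illustrated with a minimal NumPy sketch. The function names, the additive fusion, and the DistMult-style decoder below are illustrative assumptions for exposition, not the paper's actual GS-InGAT formulation.

```python
import numpy as np

DIM = 8  # toy embedding dimension (assumption for illustration)

def attention_aggregate(h_self, h_neighbors):
    """GAT-style aggregation: softmax-weighted sum of neighbor embeddings.

    h_self:      (DIM,) embedding of the center entity
    h_neighbors: (n, DIM) embeddings of its neighbors
    """
    scores = h_neighbors @ h_self                 # (n,) unnormalized attention
    weights = np.exp(scores - scores.max())       # stable softmax
    weights /= weights.sum()
    return weights @ h_neighbors                  # (DIM,) aggregated message

def fuse(h_structural, h_semantic):
    """Combine the structural and global-semantic views of an entity.

    Simple summation is one possible fusion; the paper may use another.
    """
    return h_structural + h_semantic

def score_triple(h_head, h_rel, h_tail):
    """DistMult-style decoder: a higher score means a more plausible triple."""
    return float(np.sum(h_head * h_rel * h_tail))

# Toy usage: score (head, relation, tail) from fused entity representations.
rng = np.random.default_rng(0)
h_head = fuse(attention_aggregate(rng.normal(size=DIM),
                                  rng.normal(size=(3, DIM))),
              rng.normal(size=DIM))
h_tail = fuse(rng.normal(size=DIM), rng.normal(size=DIM))
h_rel = rng.normal(size=DIM)
plausibility = score_triple(h_head, h_rel, h_tail)
```

A real implementation would learn the attention parameters and embeddings end-to-end; this sketch only shows how the three stages compose.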
