Abstract

Knowledge graph embedding, which aims to overcome the limitations of symbolic knowledge representation, has become an effective method for many downstream AI tasks, such as relation extraction and question answering. Existing knowledge graph embedding models mainly consider triples individually and ignore the structural information connecting an entity to other entities. However, the connectivity between entities not only provides the explicit structural information represented in triples, but also embodies rich implicit structural information. In this paper, a new knowledge graph embedding model, StructurE, is proposed, which captures both relational structure-context and edge structure-context information through two interactions. In addition, to model complex relations, we define different score functions for different relation types. Moreover, the four relation connectivity patterns in knowledge graphs (i.e., symmetry, antisymmetry, inversion, and composition) can also be modeled and inferred by StructurE. We evaluate StructurE on the knowledge graph link prediction task. Benefiting from the structural context and the relation-type-specific score functions, StructurE achieves state-of-the-art results for link prediction compared with conventional geometric-transformation-based knowledge graph embedding models. It also achieves state-of-the-art results compared with GCN-based models on the more challenging WN18RR dataset, which contains more symmetric relations.
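For readers unfamiliar with the four connectivity patterns mentioned above, they are commonly defined as first-order rules over relations; the sketch below uses the standard formulations from the knowledge graph embedding literature, not StructurE-specific notation:

```latex
\begin{align*}
\text{symmetry:}     &\quad r(x, y) \Rightarrow r(y, x) \\
\text{antisymmetry:} &\quad r(x, y) \Rightarrow \neg\, r(y, x) \\
\text{inversion:}    &\quad r_2(x, y) \Rightarrow r_1(y, x) \\
\text{composition:}  &\quad r_1(x, y) \wedge r_2(y, z) \Rightarrow r_3(x, z)
\end{align*}
```

A model is said to infer a pattern when its score function can be parameterized so that the rule holds for all entity embeddings; for example, a symmetric relation such as *similar_to* in WN18RR requires the score of $(x, r, y)$ and $(y, r, x)$ to be equal.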

