Abstract
Many graph embedding approaches have been proposed for knowledge graph completion via link prediction. Among these, translating embedding approaches enjoy the advantages of a light-weight structure, high efficiency and strong interpretability. Especially when extended to complex vector space, they are capable of handling various relation patterns, including symmetry, antisymmetry, inversion and composition. However, previous translating embedding approaches defined in complex vector space suffer from two main issues: 1) the representational and modeling capacity of the model is limited by a translation function restricted to the rigid multiplication of two complex numbers; and 2) the embedding ambiguity caused by one-to-many relations is not explicitly alleviated. In this paper, we propose a relation-adaptive translation function built upon a novel weighted product in complex space, where the weights are learnable, relation-specific and independent of embedding size. The translation function requires only eight more scalar parameters per relation, yet improves expressive power and alleviates the embedding ambiguity problem. Based on this function, we then present our Relation-adaptive translating Embedding (RatE) approach to score each graph triple. Moreover, a novel negative sampling method is proposed that utilizes both prior knowledge and self-adversarial learning for effective optimization. Experiments verify that RatE achieves state-of-the-art performance on four link prediction benchmarks.
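For concreteness, here is a minimal sketch of what such a relation-adaptive weighted product could look like, assuming the eight per-relation weights mix the four cross terms of the two complex operands (four weights for the real part, four for the imaginary part); the function name and shapes are illustrative, not the paper's reference implementation:

```python
import numpy as np

def weighted_complex_product(h, r, w):
    """Sketch of a relation-adaptive translation in complex space.

    h, r : complex-valued embeddings of shape (d,)
           (head entity and relation, element-wise).
    w    : eight relation-specific scalar weights; w[:4] mix the four
           cross terms for the real part, w[4:] for the imaginary part.
    """
    a, b = h.real, h.imag  # head entity:  a + bi
    c, d = r.real, r.imag  # relation:     c + di
    cross = np.stack([a * c, a * d, b * c, b * d])  # four cross terms
    real = np.tensordot(w[:4], cross, axes=1)       # weighted real part
    imag = np.tensordot(w[4:], cross, axes=1)       # weighted imaginary part
    return real + 1j * imag

# Standard complex multiplication h * r, i.e. (ac - bd) + (ad + bc)i as
# used by RotatE, is recovered as the special case w = (1,0,0,-1, 0,1,1,0):
h = np.array([0.3 + 0.5j, -0.2 + 0.1j])
r = np.array([0.8 - 0.4j, 0.6 + 0.2j])
w_rotate = np.array([1.0, 0.0, 0.0, -1.0, 0.0, 1.0, 1.0, 0.0])
assert np.allclose(weighted_complex_product(h, r, w_rotate), h * r)
```

Under this parameterization, the rigid complex multiplication is one point in the weight space, while learned relation-specific weights let each relation adapt the product to its own pattern, which is where the extra expressive power would come from.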
Highlights
A knowledge graph refers to a collection of interlinked entities, which is usually formatted as a set of triples
We propose a trans-based graph embedding approach with a novel relation-adaptive translation function in complex vector space, which achieves a better trade-off between interpretability and representational capacity than previous approaches
We study a novel trans-based graph embedding approach, called Relation-adaptive translating Embedding (RatE), for knowledge graph completion
Summary
A knowledge graph refers to a collection of interlinked entities, usually formatted as a set of triples. How to automatically complete knowledge graphs has become a popular problem in both the research and industry communities. For this purpose, many light-weight graph embedding approaches (Bordes et al., 2013; Yang et al., 2015; Sun et al., 2019) have been proposed. Unlike costly graph neural networks (GNNs) (Schlichtkrull et al., 2018), these approaches use low-dimensional embeddings to represent the entities and relations, and capture their relationships via semantic matching or geometric distance. The approaches based on geometric distance, e.g., TransE (Bordes et al., 2013) and RotatE (Sun et al., 2019), first apply a translation function to the head entity and relation embeddings to obtain a new embedding in latent space, and then score the triple by the geometric distance between this new embedding and the tail entity embedding.
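As a reference point for that family, here is a minimal sketch of the two translation-and-distance scorers named above, assuming L1 distance as in the original papers (names and shapes are illustrative):

```python
import numpy as np

def transe_score(h, r, t):
    """TransE (Bordes et al., 2013): translate the head embedding by the
    relation embedding, then score by negative L1 distance to the tail."""
    return -np.linalg.norm(h + r - t, ord=1)

def rotate_score(h, r, t):
    """RotatE (Sun et al., 2019): rotate the head in complex space via
    element-wise multiplication with a unit-modulus complex relation,
    then score by negative L1 distance (sum of complex moduli)."""
    return -np.linalg.norm(h * r - t, ord=1)

# Toy usage with 2-dimensional real embeddings:
h, r, t = np.array([0.1, 0.4]), np.array([0.2, -0.1]), np.array([0.3, 0.3])
print(transe_score(h, r, t))  # scores closer to 0 => more plausible triple
```

RatE keeps this translate-then-measure template but replaces the fixed translation with the relation-adaptive weighted product sketched earlier.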