Abstract

Extra information, such as hierarchical entity types, entity descriptions, or text corpora, has recently been used to enhance Knowledge Graph Completion (KGC). A typical task in this setting is building entities' description information into an embedding model. Existing approaches to this task usually use simple embedding models, which have difficulty handling the complex structures of knowledge graphs. These models are also limited in how the description representation is combined with the structure representation, which requires an impractically large set of weight parameters that grows in proportion to the number of entities in the knowledge graph. This paper aims to develop more effective embedding models that jointly represent the structure information of the knowledge base and the descriptions of entities. We propose a more principled approach, named Dimensional Attentive Combination (DAC), which composes the structure representation and the description representation with a fixed-size parameter set independent of the number of entities, and builds this composition on more powerful knowledge graph embedding models. The proposed model significantly reduces the number of weight parameters and can scale to KGs with large entity sets or sparse data. Experimental comparison on link prediction and relation prediction shows that our approaches, even under a simple description-encoding model, improve upon the baselines by a significant margin.
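
To illustrate the kind of composition the abstract describes, the sketch below contrasts a scalar attention weight with a dimension-wise (element-wise) gate over the structure and description embeddings. All gate parameters live in one linear layer shared by every entity, so the parameter count stays fixed as the KG grows. The names, shapes, and gating form here are our own assumptions for illustration, not the paper's exact DAC formulation.

```python
import torch
import torch.nn as nn

class DimensionalGate(nn.Module):
    """Illustrative dimension-wise combination of a structure embedding
    e_s and a description embedding e_d. The parameters are a single
    shared linear layer, so their size is independent of the number of
    entities (hypothetical sketch, not the paper's exact DAC model)."""

    def __init__(self, dim: int):
        super().__init__()
        # One weight matrix shared by all entities: (2*dim) -> dim.
        self.gate = nn.Linear(2 * dim, dim)

    def forward(self, e_s: torch.Tensor, e_d: torch.Tensor) -> torch.Tensor:
        # `a` holds one attention value per embedding dimension, unlike
        # a scalar weight that rescales the whole vector uniformly.
        a = torch.sigmoid(self.gate(torch.cat([e_s, e_d], dim=-1)))
        return a * e_s + (1.0 - a) * e_d

dim = 8
e_s, e_d = torch.randn(4, dim), torch.randn(4, dim)  # batch of 4 entities
combined = DimensionalGate(dim)(e_s, e_d)
print(combined.shape)  # torch.Size([4, 8])
```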

Highlights

  • We investigate adopting an attention mechanism [26] to combine the structure representation and the description representation.

  • We find that the conventional attention mechanism, which uses a scalar attention weight, offers insufficient expressiveness for composing the structure representation and the description representation.

Introduction

As an important resource for building information retrieval applications, knowledge graphs (KGs) have been used in many related tasks, such as named entity linking [1], relation extraction [2], and question answering [3]. Large-scale KGs have been built in recent years and have attracted intense research effort. These KGs are often incomplete; that is, a large amount of knowledge is missing. This makes KG completion an important task in KG research. Among the various models for KG completion, KG embedding, i.e., learning a distributed representation of the KG, has demonstrated great power over the past years [4]–[21]. Once the geometric structure of the embedding space is learned, missing entities or relations can be inferred.
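
To make the last point concrete: in a translation-based embedding model such as TransE, a triple (h, r, t) is scored by how well h + r ≈ t holds in the embedding space, so a missing tail entity can be inferred by ranking all candidates. The minimal sketch below uses random embeddings purely to show the mechanics; in practice the embeddings are learned from the KG.

```python
import torch

# Toy embeddings (random here purely for illustration; normally learned).
num_entities, num_relations, dim = 100, 10, 32
entity_emb = torch.randn(num_entities, dim)
relation_emb = torch.randn(num_relations, dim)

def score_tails(head: int, relation: int) -> torch.Tensor:
    """TransE-style score: a smaller ||h + r - t|| means a more
    plausible triple, so missing tails are inferred by ranking."""
    translated = entity_emb[head] + relation_emb[relation]
    return torch.norm(translated.unsqueeze(0) - entity_emb, dim=-1)

# Predict the most plausible tail entity for (head=0, relation=3).
print(score_tails(0, 3).argmin().item())
```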
