Abstract

The representation learning of knowledge graphs refers to embedding the entities and relations of a knowledge graph into a low-dimensional dense vector space. Existing knowledge graph embedding models mostly choose Euclidean space as their vector space and consider each fact triple in the knowledge graph independently. However, Euclidean space cannot represent knowledge effectively because of its strict constraints and mathematical form. Moreover, entities in a knowledge graph are not isolated, yet these models ignore the associations between entities. To address these problems, we propose a text-enhanced knowledge graph representation learning model in hyperbolic space. We exploit the rich semantic information in entity descriptions with a Transformer encoder to strengthen knowledge representation. In addition, we embed entities and relations into hyperbolic space, which better captures the hierarchical structure of the knowledge graph. Experiments on benchmark datasets show that our method achieves better performance than other state-of-the-art methods.

Keywords: Knowledge graph, Representation learning, Embedding model, Hyperbolic space
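The abstract does not spell out the hyperbolic geometry used; as an illustrative sketch only, the snippet below computes the geodesic distance in the Poincaré ball, a common basis for hyperbolic knowledge graph embeddings. The function name and the NumPy implementation are our own assumptions, not the authors' model.

```python
import numpy as np

def poincare_distance(u: np.ndarray, v: np.ndarray) -> float:
    """Geodesic distance between two points inside the unit Poincare ball.

    d(u, v) = arcosh(1 + 2 * ||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2)))
    Both inputs must have Euclidean norm strictly less than 1.
    """
    sq_diff = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return float(np.arccosh(1.0 + 2.0 * sq_diff / denom))

# Example: two hypothetical entity embeddings inside the unit ball.
# Hierarchically "deeper" entities tend to lie closer to the boundary.
e_parent = np.array([0.10, 0.05, 0.02])
e_child = np.array([0.55, 0.30, 0.10])
print(poincare_distance(e_parent, e_child))
```

Distances in the Poincaré ball grow rapidly near the boundary, which is why tree-like (hierarchical) structures can be embedded with low distortion compared with Euclidean space.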
