Abstract

Knowledge graphs, which contain billions of facts, have become a useful resource for many AI-related downstream tasks, such as knowledge inference and personalized search. However, most knowledge graphs remain incomplete, since it is infeasible to manually collect every fact in the world. In order to predict missing triples automatically, researchers have proposed many knowledge graph embedding models for the link prediction task. However, most existing models have a flaw in handling symmetric relations: they assign the same constant representation, |r|=1 or r=0, to every symmetric relation. To address this flaw, we propose a novel model called InversEF, which can learn distinct representations for different symmetric relations by learning inverse-function representations of triples. For complex relations, InversEF explicitly captures the structural information of each entity through the interaction between entities and relations. Due to the symmetry between a function and its corresponding inverse function, InversEF works well on symmetric and inverse relations. Experimental results show that InversEF achieves new state-of-the-art performance.
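
The constants mentioned above can be illustrated with two representative score functions (TransE and RotatE are used here only as examples; they are not named in the abstract). Under TransE, where a triple (h, r, t) is modeled as h + r ≈ t, a symmetric relation must satisfy both h + r ≈ t and t + r ≈ h, which forces r ≈ 0 for every symmetric relation. Under RotatE, where h ∘ r ≈ t with element-wise rotation, symmetry forces r ∘ r ≈ 1, i.e., |r| = 1 with each component equal to ±1. In both cases, all symmetric relations collapse onto the same constant, so the model cannot distinguish one symmetric relation from another.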
