Abstract

Representing entities and relations in a machine-understandable way through knowledge graph embedding (KGE) has proven to be an effective approach for predicting missing links in knowledge graphs (KGs). The success of such an approach depends mainly on the model's ability to infer relation patterns. Most existing KGE models focus on modeling simple relation patterns such as symmetry, anti-symmetry, inversion, and composition; however, few models in the literature consider complex relation patterns such as 1-N, N-1, and N-N, which are common in real-world applications. To overcome this challenge, this paper presents a new KGE model called KEMA (KGE using Modular Arithmetic), which combines projection with modular arithmetic. The main idea behind KEMA is to project the entities of a relation in order to represent the relations of a KG, and then to apply modular arithmetic to the result. KEMA is thus able to infer all simple and complex relation patterns in any KGE application. Extensive experiments on several datasets demonstrate the relevance of KEMA in effectively modeling all relations in a KG: KEMA obtains good Mean Rank (MR) and Hits@1 scores, and in particular good Hits@1 scores compared to existing models.
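The abstract does not spell out the model's scoring function, but the projection-then-modular-arithmetic idea it describes can be sketched as follows. This is a minimal illustrative sketch, not the authors' actual formulation: the relation-specific projection matrix, the modulus, and the distance-based score below are all assumptions introduced for illustration.

```python
import numpy as np

def score(head, tail, proj_r, modulus):
    """Hypothetical KGE-style plausibility score for a triple (head, r, tail):
    project the head entity with a relation-specific matrix, wrap the result
    with modular arithmetic, and measure its distance to the tail embedding.
    Higher (less negative) scores mean a more plausible triple.
    The exact form is an illustrative assumption, not KEMA's published score."""
    projected = proj_r @ head             # relation-specific projection
    wrapped = np.mod(projected, modulus)  # modular arithmetic step
    return -np.linalg.norm(wrapped - tail)  # negative distance as plausibility

# Toy usage with random 4-dimensional embeddings.
rng = np.random.default_rng(0)
d = 4
head = rng.random(d)
tail = rng.random(d)
proj_r = rng.random((d, d))   # hypothetical projection for one relation
s = score(head, tail, proj_r, modulus=1.0)
print(s)
```

Because the projected head is reduced modulo a fixed value before comparison, two distinct projections can wrap onto the same point, which is one intuition for how a single tail can match many heads (and vice versa) in N-1, 1-N, and N-N patterns.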
