Abstract

Knowledge graphs are constructed to capture large amounts of structured knowledge in a machine-readable form. To complete knowledge graphs effectively, embedding models that encode entities and relations in a real-valued tensor space have been proposed, such as the TransE model and its extensions: the flexible translation model (FT), the dynamic translation model (DT) and the manifold-based representation model (OrbitE). These baselines alleviate the problem of inaccurately representing triples with complex relations and have achieved good results, but the improvements come at the cost of a growing number of parameters. Therefore, this paper proposes an asymmetric knowledge representation learning model in manifold space (MAKR). Building on the OrbitE model, the position of a golden triple is expanded from a point to a manifold. The embeddings of the head and tail entities are then weighted separately by different relation embeddings, so that the two entities of a triple are mapped onto the same manifold rather than onto the same point. The MAKR alleviates the problems of asymmetric and imbalanced relations and of imprecise prediction, while keeping time and space complexity low. Finally, the effectiveness of the MAKR is verified on four datasets: FB15K, WN18, FB13 and WN11. Compared with other baselines, the MAKR achieves better performance on both triple classification and link prediction tasks.
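As a rough illustration only, the scoring idea described above can be sketched as follows. This is not the authors' published formula: the sphere-style manifold function, the element-wise weighting, and the names makr_score, r_h, r_t and theta_r are assumptions inferred from the abstract and from ManifoldE/OrbitE-style models.

```python
import numpy as np

def makr_score(h, r_h, t, r_t, theta_r):
    """Hypothetical MAKR-style score (a sketch, not the paper's exact model).

    The head and tail embeddings are weighted by two different relation
    embeddings, and the triple is scored by how far the weighted pair
    lies from a sphere-like manifold of radius theta_r.
    """
    weighted_h = h * r_h  # head entity weighted by its own relation embedding
    weighted_t = t * r_t  # tail entity weighted by a different relation embedding
    m = np.sum((weighted_h - weighted_t) ** 2)  # squared distance between the weighted pair
    return (m - theta_r ** 2) ** 2  # zero when the golden triple lies exactly on the manifold

# Minimal usage example with random embeddings of dimension 50.
dim = 50
rng = np.random.default_rng(0)
h, r_h, t, r_t = (rng.normal(size=dim) for _ in range(4))
print(makr_score(h, r_h, t, r_t, theta_r=1.0))
```

Under this reading, using two relation embeddings instead of one is what makes the model asymmetric between head and tail, which is the property the abstract credits for handling imbalanced relations.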
