Abstract

Knowledge graphs (KGs) are intrinsically incomplete. Knowledge representation learning methods embed the components of a knowledge graph (i.e., entities and relations) into a low-dimensional vector space with the aim of predicting missing links. KG embedding models optimize their parameters with gradient descent-based techniques, but these suffer from slow convergence, making them inefficient for dynamic knowledge graphs. Moreover, generating and modeling negative samples has a great impact on the accuracy of the prediction model. To address these challenges, we propose a correlation-based knowledge representation learning (CKRL) method. We utilize three semantic spaces, namely entity, relation, and canonical spaces. The parameter values of these spaces, including the transformation matrices and the embeddings of the KG components, are determined by defining four optimization problems. By adapting least-squares and Lagrange multiplier techniques, these optimization problems are solved with time-efficient direct methods. To model positive and negative samples, we use two distinct relation spaces. Negative samples are generated by defining the concept of relational patterns, which is used to measure the similarity between positive and negative samples. Evaluation results on the link prediction task and a training time analysis demonstrate the superiority of the proposed method over state-of-the-art methods.
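The abstract does not spell out the direct solution procedure, but the contrast it draws with gradient descent can be illustrated with a generic example. The sketch below, which assumes nothing about CKRL's actual objectives, solves for a transformation matrix in closed form via the normal equations rather than by iterative updates; all matrices and dimensions here are hypothetical.

```python
import numpy as np

# Illustrative (not the paper's method): find the transformation matrix W
# that minimizes ||W X - Y||_F^2, where X holds source-space embeddings and
# Y holds target-space embeddings. A direct least-squares solve gives W in
# a single step, avoiding the slow convergence of gradient descent.
rng = np.random.default_rng(0)
d, n = 8, 100                      # embedding dimension, number of entities (toy values)
X = rng.standard_normal((d, n))    # source-space embeddings (synthetic data)
Y = rng.standard_normal((d, n))    # target-space embeddings (synthetic data)

# Normal equations: W = Y X^T (X X^T)^{-1}.
W = Y @ X.T @ np.linalg.inv(X @ X.T)

# At the minimizer the gradient 2 (W X - Y) X^T vanishes.
grad = 2 * (W @ X - Y) @ X.T
print(np.allclose(grad, 0, atol=1e-8))  # True
```

A Lagrange-multiplier variant would add constraints (e.g., norm constraints on embeddings) and still admit a closed-form stationarity condition, which is what makes direct methods attractive for dynamic KGs that must be re-trained frequently.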
