Abstract
Knowledge graphs (KGs) are intrinsically incomplete. Knowledge representation learning methods embed the components of a knowledge graph (i.e., entities and relations) into a low-dimensional vector space with the aim of predicting missing links. KG embedding models typically optimize their parameters with gradient descent-based techniques, but these converge slowly, making them inefficient for dynamic knowledge graphs. Moreover, the generation and modeling of negative samples has a great impact on the accuracy of the prediction model. To address these challenges, we propose a correlation-based knowledge representation learning (CKRL) method. We utilize three semantic spaces, namely entity, relation, and canonical spaces. The parameters of these spaces, including the transformation matrices and the embeddings of KG components, are determined by formulating four optimization problems. By adapting least-squares and Lagrange-multiplier techniques, these optimization problems are solved with time-efficient direct methods. To model positive and negative samples, we use two distinct relation spaces. Negative samples are generated by defining the concept of relational patterns, which is used to measure the similarity between positive and negative samples. Evaluation results on the link prediction task, together with a training-time analysis, demonstrate the superiority of the proposed method over state-of-the-art methods.
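The abstract contrasts iterative gradient-descent optimization with time-efficient direct (closed-form) solvers for least-squares problems. The following is a minimal illustrative sketch of that contrast on a generic linear least-squares problem, not the paper's actual optimization problems (the data, learning rate, and iteration count are assumptions for the example):

```python
import numpy as np

# Synthetic linear least-squares problem (hypothetical data for illustration).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
w_true = rng.normal(size=5)
y = X @ w_true + 0.01 * rng.normal(size=100)

# Direct method: one closed-form solve, no iterations.
w_direct, *_ = np.linalg.lstsq(X, y, rcond=None)

# Gradient-descent baseline: many iterations to approach the same solution.
w_gd = np.zeros(5)
lr = 0.1
for _ in range(500):
    grad = X.T @ (X @ w_gd - y) / len(y)  # gradient of the mean squared error
    w_gd -= lr * grad

print(np.linalg.norm(w_direct - w_gd))  # both recover essentially the same weights
```

The direct solve reaches the optimum in a single linear-algebra step, which is the kind of efficiency advantage the paper claims over gradient-descent training, especially when the graph changes and parameters must be recomputed.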