Abstract

Knowledge graphs typically contain many missing links, and predicting the relationships between entities has become an active research topic in recent years. Knowledge graph embedding maps entities and relations into a low-dimensional continuous space in order to predict links between entities. Existing research shows that the key to a knowledge graph embedding approach is the design of its scoring function. According to the scoring function, knowledge graph embedding methods can be classified into dot-product models and distance models. We find that the triple scores produced by both the dot-product model and the distance model are unbounded, which leads to large variance. In this paper, we propose RotatE Cosine Similarity (RoCS), a method that computes the joint cosine similarity of complex vectors as a scoring function so that triple scores are bounded. Our approach combines the rotational property of the complex-vector embedding model RotatE to model complex relational patterns. Experimental results demonstrate that RoCS yields substantial improvements over RotatE across various knowledge graph benchmarks, improving hits at 1 (Hits@1) by up to 4.0% on WN18RR and by up to 3.3% on FB15K-237. Our method also achieves several new state-of-the-art (SOTA) results, including Hits@3 of 95.6% and Hits@10 of 96.4% on WN18, and a mean reciprocal rank (MRR) of 48.9% and Hits@1 of 44.5% on WN18RR.
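The abstract does not give the exact RoCS scoring function, but the boundedness argument can be illustrated with a minimal sketch. Assuming a standard RotatE setup (relations as element-wise unit-modulus complex rotations, score `-||h ∘ r - t||`) and an illustrative cosine-similarity variant that treats each complex vector as a real vector of twice the dimension, the distance score is unbounded below while the cosine score always lies in [-1, 1]. All embedding values and the helper `joint_cosine` here are hypothetical, for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8

# Illustrative complex embeddings for a head and tail entity.
h = rng.normal(size=dim) + 1j * rng.normal(size=dim)
t = rng.normal(size=dim) + 1j * rng.normal(size=dim)

# RotatE models a relation as an element-wise rotation:
# unit-modulus complex numbers r_i = exp(i * theta_i).
theta = rng.uniform(0.0, 2.0 * np.pi, size=dim)
r = np.exp(1j * theta)

# RotatE-style distance score: -||h ∘ r - t||.
# Grows arbitrarily negative as embeddings move apart (unbounded).
rotate_score = -np.linalg.norm(h * r - t)

def joint_cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity of complex vectors, viewed as real vectors
    of twice the dimension (real and imaginary parts concatenated)."""
    u = np.concatenate([u.real, u.imag])
    v = np.concatenate([v.real, v.imag])
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Cosine-similarity score of the rotated head against the tail:
# bounded in [-1, 1] by construction.
rocs_score = joint_cosine(h * r, t)
```

Because each `r_i` has unit modulus, the rotation preserves the norm of `h`, so the only unboundedness in the RotatE score comes from the distance term; normalizing it away via cosine similarity is what bounds the score.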
