Abstract

Knowledge graphs are fundamental to intelligent applications and increasingly critical across domains, powering tasks such as precise search and personalized recommendation. Effectively representing their entities and relations is key, yet the Transformer, despite its representational power, struggles to adapt to graph structure and complex relations. In this work, we present the Transformer with Relation-pattern Adaptive Contrastive Learning for Knowledge Graph Embedding (TracKGE). Specifically, TracKGE transforms the structural information of the knowledge graph into a sequence format that Transformers can process more readily. In addition, we employ a relation-pattern adaptive contrastive learning module to capture richer semantics and complex relation-pattern information in the knowledge graph. Lastly, we introduce a masked-node model that addresses incomplete information in the knowledge graph and further strengthens the model's ability to capture implicit relations. To evaluate our model, we run link prediction against well-established baselines on four widely used datasets. The experimental results show that our model excels at representing the semantics and intricate structure of knowledge graphs and outperforms advanced baseline models, demonstrating its capability to handle complex data representations.
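
The abstract does not specify TracKGE's concrete architecture or losses, but the three ingredients it names (linearizing triples into sequences for a Transformer, a contrastive objective, and node masking) can be illustrated with a minimal PyTorch sketch. Everything below (TripleEncoder, mask_nodes, info_nce, the choice of InfoNCE as the contrastive loss, and all hyperparameters) is an illustrative assumption, not the paper's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical setup: entity and relation IDs share one token space,
# with a reserved [MASK] token at index 0.
MASK_ID = 0

def linearize_triples(triples):
    """Turn (head, relation, tail) ID triples into token sequences
    a Transformer can consume: one length-3 sequence per triple."""
    return torch.tensor(triples, dtype=torch.long)  # shape (B, 3)

def mask_nodes(seq, mask_prob=0.5):
    """Randomly replace entity positions (head/tail) with [MASK],
    mimicking a masked-node objective; the relation position is kept."""
    seq = seq.clone()
    entity_positions = torch.zeros_like(seq, dtype=torch.bool)
    entity_positions[:, 0] = True  # head
    entity_positions[:, 2] = True  # tail
    mask = (torch.rand_like(seq, dtype=torch.float) < mask_prob) & entity_positions
    mask[0, 0] = True  # guarantee at least one masked position so the demo loss is defined
    labels = torch.where(mask, seq, torch.full_like(seq, -100))  # -100 is ignored by CE
    seq[mask] = MASK_ID
    return seq, labels

class TripleEncoder(nn.Module):
    """Minimal Transformer encoder over linearized triples."""
    def __init__(self, vocab_size, dim=128, heads=4, layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.pos = nn.Embedding(3, dim)  # positions: head, relation, tail
        layer = nn.TransformerEncoderLayer(dim, heads, dim * 4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, layers)
        self.out = nn.Linear(dim, vocab_size)

    def forward(self, seq):
        pos = torch.arange(seq.size(1), device=seq.device)
        hidden = self.encoder(self.embed(seq) + self.pos(pos))
        return hidden, self.out(hidden)

def info_nce(anchor, positive, temperature=0.1):
    """Standard InfoNCE loss: each anchor's positive is the matching
    row; all other rows in the batch act as in-batch negatives."""
    a = F.normalize(anchor, dim=-1)
    p = F.normalize(positive, dim=-1)
    logits = a @ p.t() / temperature
    targets = torch.arange(a.size(0), device=a.device)
    return F.cross_entropy(logits, targets)

# Usage: triples as (head, relation, tail) integer IDs.
triples = [(5, 2, 9), (7, 3, 4), (5, 3, 8)]
model = TripleEncoder(vocab_size=20)
seq = linearize_triples(triples)
masked, labels = mask_nodes(seq)
hidden, logits = model(masked)

# Masked-node loss: recover the original entity IDs at masked positions.
mask_loss = F.cross_entropy(
    logits.view(-1, logits.size(-1)), labels.view(-1), ignore_index=-100
)

# Contrastive loss: two independently masked views of the same triple
# form a positive pair; mean-pooled hidden states serve as triple embeddings.
view2, _ = mask_nodes(seq)
hidden2, _ = model(view2)
cl_loss = info_nce(hidden.mean(dim=1), hidden2.mean(dim=1))

loss = mask_loss + cl_loss
loss.backward()
```

In this sketch, two masked views of the same triple serve as the positive pair; the paper's actual relation-pattern adaptive strategy for selecting contrastive pairs is not reproduced here.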
