Abstract

Knowledge graph completion infers missing triples from existing ones, thereby enriching the graph and improving its completeness. Recent research has shown that hyperbolic representation learning gives knowledge graph completion models superior expressive and generalization power. However, the long-tail distribution of entities and relations, together with the properties of the hyperbolic metric, makes low-frequency entities and relations difficult to learn effectively, distorting the embedding space and damaging the original semantic relationships. This paper therefore proposes Att-CL, a knowledge graph completion method that combines hyperbolic representation learning with contrastive learning. Knowledge is embedded in a hyperbolic space, and samples with weak hierarchical structure and sparse feature information are augmented with adversarial noise: the loss of an embedded sample is backpropagated to its embedding vector, the perturbation is applied along the gradient direction to encourage smoothness and locality, and a hyperparameter tunes the adversarial strength when constructing adversarial samples for data augmentation, improving model robustness. To mitigate the distortion induced by the hyperbolic metric, a penalty term is added to the contrastive loss to control how far embedding vectors lie from the origin, reducing the metric's influence and further improving the model's completion ability. Experiments on the WN18RR and FB15K-237 benchmark datasets show significant gains in MRR, Hits@1, and Hits@3 over traditional knowledge graph completion models, providing ample evidence of the model's effectiveness.
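The two augmentation ideas summarized above — a gradient-direction adversarial perturbation with a tunable strength, and a contrastive loss carrying an origin-distance penalty — can be illustrated with a minimal sketch. This is not the paper's implementation: it works in plain Euclidean coordinates rather than on a hyperbolic manifold, and the function names and hyperparameters (`epsilon`, `tau`, `lam`) are illustrative assumptions.

```python
import numpy as np

def adversarial_perturb(emb, grad, epsilon=0.05):
    """FGSM-style augmentation (an assumption, not the paper's exact rule):
    step along the sign of the loss gradient, scaled by epsilon,
    which acts as the adversarial-strength hyperparameter."""
    return emb + epsilon * np.sign(grad)

def contrastive_loss_with_origin_penalty(anchor, positive, negatives,
                                         tau=0.5, lam=0.01):
    """InfoNCE-style contrastive loss plus a penalty on the squared
    norms of the embeddings, discouraging vectors from drifting far
    from the origin (where a hyperbolic metric distorts most)."""
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    pos = np.exp(cos(anchor, positive) / tau)
    neg = sum(np.exp(cos(anchor, n) / tau) for n in negatives)
    info_nce = -np.log(pos / (pos + neg))
    penalty = lam * (np.linalg.norm(anchor) ** 2
                     + np.linalg.norm(positive) ** 2)
    return info_nce + penalty

# Illustrative usage with random embeddings
rng = np.random.default_rng(0)
anchor = rng.normal(size=8)
positive = anchor + 0.01 * rng.normal(size=8)      # a perturbed view
negatives = [rng.normal(size=8) for _ in range(4)]
adv = adversarial_perturb(anchor, rng.normal(size=8))
loss = contrastive_loss_with_origin_penalty(adv, positive, negatives)
```

In the paper's setting the perturbed embedding would serve as an additional positive view in the contrastive objective, and `lam` would balance the origin-distance penalty against the alignment term.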
