Knowledge graphs provide a pivotal framework for the structured representation of information about entities and relations. In practice, however, these graphs are often incomplete and contain many missing facts. Knowledge graph completion (KGC), which aims to automatically predict these missing facts, has therefore become a central research focus and has attracted substantial attention in recent years. Text-based knowledge graph embedding methods that employ pre-trained language models have shown considerable potential for KGC, but their lack of logical features limits their ability to capture intricate patterns within knowledge graphs. This paper proposes SimRE, a straightforward contrastive learning framework augmented with soft logic rules. SimRE introduces a self-supervised framework in which input rule bodies are used to predict the corresponding rule heads through a contrastive objective. To enhance the efficiency and accuracy of the model, we introduce two rule-negative sampling techniques: in-batch rule negatives and pre-batch rule negatives. SimRE employs a simple method for integrating logical features with the text-based model. Experimental results on benchmark datasets demonstrate that the proposed approach outperforms state-of-the-art methods.
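To illustrate the kind of objective the abstract describes, the sketch below shows a generic InfoNCE-style contrastive loss in which rule-body embeddings are trained to match rule-head embeddings, with the other heads in the batch serving as in-batch negatives and a cache of heads from earlier batches serving as pre-batch negatives. This is a minimal sketch under assumed conventions, not the authors' implementation; the function name, temperature value, and cache handling are illustrative.

```python
# Minimal, hypothetical sketch of a contrastive rule objective with in-batch
# and pre-batch negatives (not the SimRE code; all names are illustrative).
import torch
import torch.nn.functional as F

def contrastive_rule_loss(body_emb, head_emb, prebatch_heads=None, temperature=0.05):
    """body_emb, head_emb: (B, d) embeddings of rule bodies and their rule heads.
    prebatch_heads: optional (M, d) head embeddings cached from previous batches."""
    body_emb = F.normalize(body_emb, dim=-1)
    head_emb = F.normalize(head_emb, dim=-1)

    # In-batch negatives: for body i, every head j != i in the batch is a negative.
    logits = body_emb @ head_emb.t() / temperature            # (B, B)

    if prebatch_heads is not None:
        # Pre-batch negatives: heads cached from earlier batches add extra columns.
        prebatch_heads = F.normalize(prebatch_heads, dim=-1)
        extra = body_emb @ prebatch_heads.t() / temperature   # (B, M)
        logits = torch.cat([logits, extra], dim=1)            # (B, B + M)

    # The positive for body i is its own head, i.e. column i.
    targets = torch.arange(body_emb.size(0), device=body_emb.device)
    return F.cross_entropy(logits, targets)

if __name__ == "__main__":
    B, d = 8, 128
    loss = contrastive_rule_loss(torch.randn(B, d), torch.randn(B, d),
                                 prebatch_heads=torch.randn(32, d))
    print(loss.item())
```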