Logical query reasoning over knowledge graphs (KGs) is an important task that retrieves information satisfying specified conditions. Despite recent advancements, existing methods typically focus on the inherent structure of logical queries and fail to capture the commonality among entities and relations, resulting in cascading errors during multi-hop inference. To mitigate this issue, we implicitly infer relations’ domain constraints from the commonality of their connected entities. Specifically, to capture the domain constraint of a relation, we treat the set of relations emitted by an entity as its implicit concept information and derive the relation’s domain constraint by aggregating the implicit concept information of its head entities. Employing a geometric embedding strategy, we enrich the representations of entities in the query with their implicit concept information. Additionally, we design a straightforward yet effective curriculum learning strategy to refine the model’s reasoning ability. Notably, our model can be integrated into any existing query embedding-based logical query reasoning method in a plug-and-play manner, enhancing its understanding of both entities and relations in queries. Experiments on three widely used datasets show that our model achieves competitive results and improves the performance of existing logical query reasoning models. In particular, as a plug-in, it yields an absolute improvement of up to 8.4% in Hits@3 over the original model on the FB15k dataset, and it surpasses the previous state-of-the-art plug-and-play logical query reasoning model in most settings, exceeding it by up to 2.1% in average Hits@3.
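To make the core idea concrete, the snippet below is a minimal sketch (not the paper's actual formulation) of how an entity's implicit concept information and a relation's domain constraint could be computed over raw triples; all function and variable names, the set-based representation, and the union aggregation are illustrative assumptions rather than details taken from the paper.

```python
from collections import defaultdict

# Hypothetical sketch: an entity's implicit concept is the set of relations it emits;
# a relation's domain constraint aggregates the implicit concepts of its head entities.

def implicit_concepts(triples):
    """Map each entity to the set of relations it emits (its implicit concept)."""
    concept = defaultdict(set)
    for head, relation, _tail in triples:
        concept[head].add(relation)
    return concept

def domain_constraints(triples):
    """Aggregate head entities' implicit concepts into each relation's domain constraint."""
    concept = implicit_concepts(triples)
    domain = defaultdict(set)
    for head, relation, _tail in triples:
        domain[relation] |= concept[head]  # set union chosen here as a simple aggregation
    return domain

# Toy usage on a few (head, relation, tail) triples.
kg = [
    ("Turing", "born_in", "London"),
    ("Turing", "field", "CS"),
    ("Hinton", "born_in", "Wimbledon"),
    ("Hinton", "field", "CS"),
]
print(domain_constraints(kg)["born_in"])  # {'born_in', 'field'}
```

In the paper's setting, this discrete aggregation would correspond to combining learned (geometric) embeddings of the emitted relations rather than literal sets, but the dependency structure, from head entities' outgoing relations to a per-relation domain constraint, is the same.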