Multi-relational knowledge graph contrastive learning for link prediction


Similar Papers
  • Conference Article
  • Cited by 1
  • 10.1145/3587828.3587830
Improving the Efficiency of Link Prediction on Handling Incomplete Knowledge Graph Using Clustering
  • Feb 23, 2023
  • Fitri Susanti + 2 more

A knowledge graph (KG) stores knowledge in the form of connected facts, represented as triples (subject, predicate, object), also written (head, relation, tail). KGs are widely used in question answering, information retrieval, classification, recommender systems, and other tasks. A common problem, however, is incompleteness: a KG is incomplete if a relationship between two entities is missing, and an incomplete KG can reduce the accuracy of any task that relies on it. One solution is link prediction, which aims to predict the missing relationships between entities in a KG. Another problem is scale: a KG may contain hundreds of thousands or millions of entities and relationships, so link prediction on large KGs must also be made efficient. This paper discusses embedding-based link prediction to overcome the incomplete-KG problem and proposes clustering to increase the efficiency of the link prediction process. Clustering groups the embedding results; scoring and loss-function calculations for predicting missing links are then carried out only within the groups considered relevant. With this grouping, the link prediction process is expected to take less time because there is no need to check all the vectors in the embedding space.
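The clustering idea in this abstract can be sketched as follows: group pre-trained entity embeddings with k-means, then at query time score candidates only within the cluster nearest the query. This is a minimal illustration, not the paper's actual pipeline; all sizes and names below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: 200 pre-trained entity embeddings in 8 dimensions
# (sizes are illustrative only).
entity_emb = rng.normal(size=(200, 8))

def kmeans(points, k, iters=20):
    """Plain k-means: returns final centroids and cluster assignments."""
    centroids = points[rng.choice(len(points), size=k, replace=False)].copy()
    assign = np.zeros(len(points), dtype=int)
    for _ in range(iters):
        # Distance of every point to every centroid, then reassign.
        dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=-1)
        assign = dists.argmin(axis=1)
        for c in range(k):
            members = points[assign == c]
            if len(members) > 0:
                centroids[c] = members.mean(axis=0)
    return centroids, assign

centroids, assign = kmeans(entity_emb, k=4)

def candidate_entities(query_vec):
    """Restrict scoring to the cluster whose centroid is nearest the
    query vector, instead of checking every vector in the space."""
    nearest = np.linalg.norm(centroids - query_vec, axis=1).argmin()
    return np.flatnonzero(assign == nearest)

cands = candidate_entities(entity_emb[0])
```

The efficiency gain comes from the candidate set being a (usually much smaller) subset of all entities, at the cost of possibly missing answers that fall in another cluster.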

  • Conference Article
  • Cited by 3
  • 10.1109/bigdata50022.2020.9378475
Link Prediction in Knowledge Graphs with Numeric Triples Using Clustering
  • Dec 10, 2020
  • Betul Bayrak + 2 more

Knowledge graphs (KGs) contain large amounts of structured data in many different domains; knowledge is captured as entities and the relationships between them. One of the open problems in the knowledge graph area is link prediction: predicting new relationships, or links, between existing entities in a KG. A recent approach to graph-based learning problems is embedding, in which graphs are represented as low-dimensional vectors, making link prediction easier over these vector representations; we likewise use graph embedding for our graph representations. A sub-problem of link prediction in KGs is link prediction in the presence of literal values, specifically numeric values, on the receiving end of links. This is a harder problem because numeric literals take arbitrary values, so link prediction models cannot handle such entries: numeric entities are not embedded in the vector space. Several studies address this problem, but all rely on complex approaches. In this study, we propose a novel approach for link prediction in KGs in the presence of numerical values. To overcome the embedding problem, we cluster the numerical values in the knowledge graph and then use the clusters to perform link prediction. We evaluated our method on a part of the Freebase knowledge graph that includes entities, relations, and numerical literals. Test results show a considerable increase in link prediction rate compared to previous studies.

  • Research Article
  • Cited by 1
  • 10.1016/j.jbi.2024.104725
Community knowledge graph abstraction for enhanced link prediction: A study on PubMed knowledge graph
  • Sep 10, 2024
  • Journal of Biomedical Informatics
  • Yang Zhao + 4 more

  • Research Article
  • 10.1016/j.jksuci.2024.102181
Improving embedding-based link prediction performance using clustering
  • Sep 13, 2024
  • Journal of King Saud University - Computer and Information Sciences
  • Fitri Susanti + 2 more

  • Book Chapter
  • 10.3233/ssw240010
TWIG-I: Embedding-Free Link Prediction and Cross-KG Transfer Learning Using a Small Neural Architecture
  • Sep 11, 2024
  • Jeffrey Sardina + 3 more

Knowledge Graphs (KGs) are relational knowledge bases that represent facts as a set of labelled nodes and the labelled relations between them. Their machine learning counterpart, Knowledge Graph Embeddings (KGEs), learn to predict new facts based on the data contained in a KG – the so-called link prediction task. To date, almost all forms of link prediction for KGs rely on some form of embedding model, and KGEs hold state-of-the-art status for link prediction. In this paper, we present TWIG-I (Topologically-Weighted Intelligence Generation for Inference), a novel link prediction system that can represent the features of a KG in latent space without using node or edge embeddings. TWIG-I shows mixed performance relative to state-of-the-art KGE models – at times exceeding or falling short of baseline performance. However, unlike KGEs, TWIG-I can be natively used for transfer learning across distinct KGs. We show that using transfer learning with TWIG-I can lead to increases in performance in some cases both over KGE baselines and over TWIG-I models trained without finetuning. While these results are still mixed, TWIG-I clearly demonstrates that structural features are sufficient to solve the link prediction task in the absence of embeddings. Finally, TWIG-I opens up cross-KG transfer learning as a new direction in link prediction research and application.

  • Book Chapter
  • Cited by 2
  • 10.1007/978-3-030-93413-2_44
Online Updates of Knowledge Graph Embedding
  • Jan 1, 2022
  • Luo Fei + 2 more

Complex networks can be modeled as knowledge graphs (KGs), with nodes and edges denoting entities and the relations among them, respectively. A knowledge graph embedding assigns each node and edge in a KG a low-dimensional semantic vector such that the original structure and relations of the KG are approximately preserved in these learned vectors. KG embeddings support downstream applications such as KG completion, classification, entity resolution, link prediction, question answering, and recommendation. In the real world, KGs are dynamic and evolve over time, yet state-of-the-art KG embedding models deal with static KGs: to support dynamic updates (even local ones), they must be retrained on the whole KG from scratch, which is inefficient. To this end, we propose a new context-aware Online Updates of Knowledge Graph Embedding (OUKE) method, which supports embedding updates in an online manner. OUKE learns two different vectors for each node and edge, i.e., a knowledge embedding and a context embedding. This strategy effectively limits the impact of a local update to a smaller region, so OUKE can efficiently update the KG embedding. Experiments on link prediction in dynamic KGs demonstrate both the effectiveness and efficiency of our solution. Keywords: Knowledge graphs, Embedding, Dynamic updates

  • Research Article
  • Cited by 7
  • 10.3233/sw-222960
MADLINK: Attentive multihop and entity descriptions for link prediction in knowledge graphs
  • Jan 12, 2024
  • Semantic Web
  • Russa Biswas + 2 more

Knowledge Graphs (KGs) comprise interlinked information in the form of entities and the relations between them in a particular domain, and provide the backbone for many applications. However, KGs are often incomplete, as links between entities are missing. Link prediction is the task of predicting these missing links in a KG based on the existing ones. Recent years have witnessed many studies on link prediction using KG embeddings, one of the mainstream tasks in KG completion. To do so, most existing methods learn latent representations of the entities and relations, whereas only a few also consider contextual information or the textual descriptions of the entities. This paper introduces an attentive encoder-decoder based link prediction approach that considers both the structural information of the KG and the textual entity descriptions. A random-walk-based path selection method is used to encapsulate the contextual information of an entity in the KG. The model uses a bidirectional Gated Recurrent Unit (GRU) based encoder-decoder to learn the representation of the paths, while SBERT generates the representations of the entity descriptions. The proposed approach outperforms most state-of-the-art models and achieves comparable results with the rest when evaluated on FB15K, FB15K-237, WN18, WN18RR, and YAGO3-10.

  • Conference Article
  • Cited by 73
  • 10.18653/v1/2022.acl-long.201
Sequence-to-Sequence Knowledge Graph Completion and Question Answering
  • Jan 1, 2022
  • Apoorv Saxena + 2 more

Knowledge graph embedding (KGE) models represent each entity and relation of a knowledge graph (KG) with low-dimensional embedding vectors. These methods have recently been applied to KG link prediction and question answering over incomplete KGs (KGQA). KGEs typically create an embedding for each entity in the graph, which results in large model sizes on real-world graphs with millions of entities. For downstream tasks, these atomic entity representations often need to be integrated into a multi-stage pipeline, limiting their utility. We show that an off-the-shelf encoder-decoder Transformer model can serve as a scalable and versatile KGE model, obtaining state-of-the-art results for KG link prediction and incomplete KG question answering. We achieve this by posing KG link prediction as a sequence-to-sequence task, replacing the triple-scoring approach of prior KGE methods with autoregressive decoding. This simple but powerful method reduces model size by up to 98% compared to conventional KGE models while keeping inference time tractable. After finetuning this model on KGQA over incomplete KGs, our approach outperforms baselines on multiple large-scale datasets without extensive hyperparameter tuning.
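The sequence-to-sequence framing described above can be illustrated with a toy verbalization: a link-prediction query becomes an input string, and the answer entity becomes the target string to decode. The serialization format below is an assumption for illustration, not the paper's exact scheme.

```python
def make_example(head, relation, tail, direction="tail"):
    # Serialize a KG link-prediction query as a text-to-text pair, so an
    # encoder-decoder model can decode the answer autoregressively
    # instead of scoring every candidate triple.
    if direction == "tail":
        return f"query: {head} | {relation} | ?", tail
    return f"query: ? | {relation} | {tail}", head

# One hypothetical training pair for tail prediction:
src, tgt = make_example("Dublin", "capital_of", "Ireland")
# src == "query: Dublin | capital_of | ?", tgt == "Ireland"
```

Because the model emits entity names as text rather than storing one vector per entity, model size no longer grows with the number of entities, which is the source of the reported size reduction.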

  • Research Article
  • Cited by 28
  • 10.1016/j.eswa.2020.114164
KGEL: A novel end-to-end embedding learning framework for knowledge graph completion
  • Oct 27, 2020
  • Expert Systems with Applications
  • Adnan Zeb + 4 more

  • Video Transcripts
  • 10.48448/vffx-4m23
Sequence-to-Sequence Knowledge Graph Completion and Question Answering
  • May 7, 2022
  • Rainer Gemulla + 2 more

  • Research Article
  • Cited by 15
  • 10.1109/tbdata.2018.2867583
Link Prediction in Knowledge Graphs: A Hierarchy-Constrained Approach
  • Jun 1, 2022
  • IEEE Transactions on Big Data
  • Manling Li + 4 more

Link prediction over a knowledge graph aims to predict the missing head entity h or tail entity t, or the missing relation r, for a triple (h, r, t). Recent years have witnessed great advances in knowledge graph embedding based link prediction methods, which represent entities and relations as elements of a continuous vector space. Most methods learn the embedding vectors by optimizing a margin-based loss function, where the margin separates negative from positive triples. The loss function exploits the general structure of knowledge graphs, e.g., the vector of r is the translation from the vector of h to the vector of t, and the vector of t should be the nearest neighbor of the vector of h + r. However, knowledge graphs contain many particular structures that can be employed to improve link prediction. One typical example is the hierarchical structure, which existing methods have left largely unexplored. We argue that hierarchical structures also contain rich inference patterns and can further enhance link prediction performance. In this paper, we propose a hierarchy-constrained link prediction method, called hTransM, built on translation-based knowledge graph embedding methods. It adaptively determines the optimal margin by detecting single-step and multi-step hierarchical structures. Moreover, we prove the effectiveness of hTransM theoretically, and experiments on three benchmark datasets and two sub-tasks of link prediction demonstrate its superiority.
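The translation-based scoring and margin-based ranking loss referred to above (h + r ≈ t, as in TransE-style models) can be sketched as follows. This is a generic sketch, not the hTransM model itself; embedding sizes and the fixed margin are illustrative assumptions, whereas hTransM's contribution is choosing the margin adaptively.

```python
import numpy as np

rng = np.random.default_rng(1)
n_ent, n_rel, dim = 50, 4, 16
E = rng.normal(size=(n_ent, dim))  # entity embedding vectors
R = rng.normal(size=(n_rel, dim))  # relation embedding vectors

def score(h, r, t):
    # Translation-based score: small when the vector of t is close
    # to the vector of h + r.
    return np.linalg.norm(E[h] + R[r] - E[t])

def margin_loss(positives, negatives, margin=1.0):
    # Margin-based ranking loss: push each positive triple's score
    # at least `margin` below its corrupted (negative) counterpart.
    total = 0.0
    for (h, r, t), (h2, r2, t2) in zip(positives, negatives):
        total += max(0.0, margin + score(h, r, t) - score(h2, r2, t2))
    return total / len(positives)

# One positive triple and one corrupted triple (tail replaced).
loss = margin_loss([(0, 0, 1)], [(0, 0, 2)])
```

Training would minimize this loss over many (positive, corrupted) pairs; the hinge form means only pairs violating the margin contribute gradient.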

  • Research Article
  • Cited by 4
  • 10.1016/j.neucom.2022.06.032
A probabilistic ensemble approach for knowledge graph embedding
  • Jun 8, 2022
  • Neurocomputing
  • Yinquan Wang + 3 more

  • Research Article
  • 10.1016/j.procs.2024.09.509
DKGR: A Distributed Framework for Geometric Knowledge Graph Embedding with Ray
  • Jan 1, 2024
  • Procedia Computer Science
  • Mohammed Khatbane + 3 more

  • Conference Article
  • Cited by 1
  • 10.18653/v1/2021.findings-emnlp.238
Open-Domain Contextual Link Prediction and its Complementarity with Entailment Graphs
  • Jan 1, 2021
  • Mohammad Javad Hosseini + 3 more

An open-domain knowledge graph (KG) has entities as nodes and natural language relations as edges, and is constructed by extracting (subject, relation, object) triples from text. The task of open-domain link prediction is to infer missing relations in the KG. Previous work has used standard link prediction for the task. Since triples are extracted from text, we can ground them in the larger textual context in which they were originally found. However, standard link prediction methods only rely on the KG structure and ignore the textual context that each triple was extracted from. In this paper, we introduce the new task of open-domain contextual link prediction which has access to both the textual context and the KG structure to perform link prediction. We build a dataset for the task and propose a model for it. Our experiments show that context is crucial in predicting missing relations. We also demonstrate the utility of contextual link prediction in discovering context-independent entailments between relations, in the form of entailment graphs (EG), in which the nodes are the relations. The reverse holds too: context-independent EGs assist in predicting relations in context.

  • Video Transcripts
  • 10.48448/wbny-e915
Open-Domain Contextual Link Prediction and its Complementarity with Entailment Graphs
  • Oct 23, 2021
  • Mark + 3 more
