Abstract

This paper proposes a new framework that semantically aligns two or more entities across cybersecurity-related Knowledge Graphs (KGs) using an external resource. To do so, we identify four main principles that an external resource must satisfy and use them to analyze candidate resources. The selected resource is used to find sentences needed to understand the usage context of the entities. Entity alignment is then performed via semantic embedding with BERT, where a semantic embedding is defined as a vector encoding the latent semantic features of only those sentences from the external resource that share a similar usage context, encoded with the language model BERT. To identify sentences with a similar usage context, we first classify the informative entities related to the target entities and use them to generate a set of sentences sharing that context. Finally, to predict semantic relationships (equivalence) between entities, we feed this set of sentences into a pre-trained BERT model. To demonstrate the effectiveness of the framework, we conduct experiments evaluating the accuracy of equivalence prediction for entities from different KGs.

Keywords: Cybersecurity, Knowledge graph, Entity alignment, Semantic embeddings
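As a rough illustration of the kind of pipeline the abstract describes, the sketch below embeds usage-context sentences for two entities with a pre-trained BERT and scores their equivalence by cosine similarity. This is not the authors' implementation: the model name (`bert-base-uncased`), the mean-pooling scheme, the example sentences, and the similarity-based scoring are all assumptions made for illustration.

```python
# Illustrative sketch only: encode context sentences with pre-trained BERT
# and compare two entities via cosine similarity of their pooled embeddings.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def embed_sentences(sentences):
    """Encode a list of context sentences and return one mean-pooled vector."""
    inputs = tokenizer(sentences, return_tensors="pt", padding=True, truncation=True)
    with torch.no_grad():
        outputs = model(**inputs)
    token_embeddings = outputs.last_hidden_state      # (batch, seq_len, hidden)
    mask = inputs["attention_mask"].unsqueeze(-1)      # mask out padding tokens
    sentence_vectors = (token_embeddings * mask).sum(1) / mask.sum(1)
    return sentence_vectors.mean(0)                    # average over sentences

# Hypothetical usage-context sentences for two entities from an external resource.
ctx_a = ["The exploit leverages a buffer overflow in the SMB service."]
ctx_b = ["A buffer overflow in the SMB protocol is abused by the exploit."]

vec_a = embed_sentences(ctx_a)
vec_b = embed_sentences(ctx_b)
similarity = torch.nn.functional.cosine_similarity(vec_a, vec_b, dim=0)
print(f"Equivalence score: {similarity.item():.3f}")
```

In the paper's framework, the sentence sets would come from the chosen external resource after filtering by informative entities; here they are stand-ins to show the embedding-and-compare step.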
