Abstract

In the era of big data, a large volume of unstructured text is produced every day. Entity linking is the task of mapping the mentions found in a text to the corresponding entities in a knowledge base, where each entity represents an object in the real world. This task helps computers interpret the semantics of text correctly. Although numerous approaches have been proposed, several challenges remain unresolved. Most current approaches use neural models to learn salient features of the entity and the mention context. However, the topic coherence among the referred entities is frequently ignored, which leads to a clear preference for popular entities and poor accuracy for less popular ones. Moreover, graph-based models suffer from considerable noise and high computational complexity. To address these problems, this paper puts forward an entity linking algorithm based on an asymmetric graph convolutional network and contextualized semantic relevance, which makes full use of neighboring-node information while suppressing unnecessary noise in the graph. The semantic vector of each candidate entity is obtained by iteratively aggregating information from its neighboring nodes. The contextualized relevance model is a symmetric structure designed to measure deep semantic similarity between mentions and entities. The experimental results show that the proposed algorithm fully exploits the topological information of the graph and substantially improves entity linking performance compared with the baselines.
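To make the two components described above concrete, the sketch below illustrates the general shape of the approach under stated assumptions: a standard GCN-style propagation rule stands in for the paper's asymmetric graph convolution, and cosine similarity stands in for its contextualized relevance model. The function names, feature dimensions, and toy graph are illustrative choices, not the paper's actual architecture or hyperparameters.

```python
import numpy as np


def gcn_aggregate(adj, features, weights):
    """Iteratively aggregate neighbor information with a GCN-style rule:
    H^(l+1) = ReLU(D^{-1/2} (A + I) D^{-1/2} H^(l) W^(l)).
    (Illustrative stand-in for the paper's asymmetric graph convolution.)"""
    a_hat = adj + np.eye(adj.shape[0])          # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))
    norm_adj = d_inv_sqrt @ a_hat @ d_inv_sqrt  # symmetric normalization
    h = features
    for w in weights:                           # one weight matrix per layer
        h = np.maximum(norm_adj @ h @ w, 0.0)   # aggregate neighbors + ReLU
    return h


def relevance(mention_vec, entity_vec):
    """Symmetric semantic relevance between a mention context vector and an
    entity vector, approximated here by cosine similarity."""
    denom = np.linalg.norm(mention_vec) * np.linalg.norm(entity_vec) + 1e-9
    return float(mention_vec @ entity_vec) / denom


# Toy example: 4 candidate-entity nodes, 8-dim features, two aggregation layers.
rng = np.random.default_rng(0)
adj = np.array([[0, 1, 1, 0],
                [1, 0, 0, 1],
                [1, 0, 0, 1],
                [0, 1, 1, 0]], dtype=float)
feats = rng.normal(size=(4, 8))
weights = [rng.normal(size=(8, 8)), rng.normal(size=(8, 8))]

entity_embs = gcn_aggregate(adj, feats, weights)
mention_ctx = rng.normal(size=8)
scores = [relevance(mention_ctx, e) for e in entity_embs]
best_candidate = int(np.argmax(scores))  # pick the most relevant candidate entity
```

The key design point reflected here is that each candidate entity's representation is refined by repeatedly mixing in its neighbors' features before it is compared against the mention context, so topically coherent candidates reinforce one another during disambiguation.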
