Abstract

Entity linking identifies named entity mentions in text and maps them to entries in a knowledge base, making it a key technology for knowledge graph construction and an important component of intelligent question answering and information retrieval. However, existing entity linking methods perform poorly on short texts, which lack rich contextual information, use informal expressions, and have incomplete grammatical structures. This paper therefore proposes a short-text entity linking model based on the RoFormer-Sim pre-trained model. First, entity context features are extracted with RoFormer-Sim; then text similarity between the mention context and the candidate entities' description texts is computed and ranked, so that the disambiguated mention is linked to its corresponding entity in the knowledge base. Experimental results show that RoFormer-Sim provides useful prior knowledge for entity linking, and the proposed model achieves an F1 score of 0.8851, outperforming entity linking models built on other pre-trained models.
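
The linking step described in the abstract amounts to encoding the mention context and each candidate's knowledge-base description with the same encoder, then ranking candidates by similarity. The sketch below illustrates that idea with the Hugging Face transformers library; the checkpoint name, mean pooling, and example texts are illustrative assumptions, not the authors' exact configuration.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Placeholder checkpoint: substitute the RoFormer-Sim weights you actually use.
MODEL_NAME = "your-roformer-sim-checkpoint"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
encoder = AutoModel.from_pretrained(MODEL_NAME)

def encode(texts):
    """Mean-pool the last hidden states to get one vector per text."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = encoder(**batch).last_hidden_state      # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1)         # (B, T, 1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)  # (B, H)

def link(mention_context, candidate_descriptions):
    """Rank candidate entity descriptions by cosine similarity to the mention context."""
    query = encode([mention_context])                    # (1, H)
    cands = encode(candidate_descriptions)               # (N, H)
    scores = torch.nn.functional.cosine_similarity(query, cands)
    return int(scores.argmax()), scores.tolist()

# Example: disambiguate the short-text mention "苹果" (Apple) against two candidates.
idx, scores = link(
    "苹果发布了新款手机",                                  # "Apple released a new phone"
    ["苹果公司，美国科技企业", "苹果，蔷薇科苹果属水果"],   # company vs. fruit description
)
print(idx, scores)
```

The ranking here is purely similarity-based; the paper's model additionally fine-tunes the encoder on entity-linking supervision, which this sketch does not reproduce.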
