Abstract

Named Entity Recognition (NER) is generally treated as a sequence labeling task, an approach that breaks down when named entities are nested. Span-based models, which enumerate all possible spans in a sentence as potential entity mentions and classify them, handle nested NER naturally but suffer from an abundance of negative samples. In this paper, we propose a span-based nested NER model built on BERT and address the negative sample problem in two ways. Because most enumerated spans are negative samples, we adopt a multi-task learning method that divides NER into an entity identification task and an entity classification task. In addition, we propose an entity IoU loss function that focuses the model on hard negative samples. We evaluate our model on three nested NER datasets, GENIA, ACE2004, and ACE2005, where it outperforms other state-of-the-art models using the same pretrained language model, achieving F1 scores of 79.46%, 87.30%, and 85.24%, respectively.
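To make the entity IoU idea concrete, the sketch below illustrates one plausible reading of it: the IoU of two token spans is the size of their overlap divided by the size of their union, and a negative span's loss weight is its best IoU against any gold entity, so partially overlapping (hard) negatives dominate training. The abstract does not give the exact formulation, so the functions and the weighting scheme here are assumptions for illustration only.

```python
# Hypothetical sketch of span-level IoU weighting for negative samples.
# The paper's actual entity IoU loss is not specified in the abstract;
# this assumes inclusive (start, end) token spans and that negatives with
# higher IoU against gold entities receive larger loss weights.

def span_iou(span_a, span_b):
    """IoU of two token spans given as inclusive (start, end) index pairs."""
    overlap = min(span_a[1], span_b[1]) - max(span_a[0], span_b[0]) + 1
    if overlap <= 0:
        return 0.0
    union = (span_a[1] - span_a[0] + 1) + (span_b[1] - span_b[0] + 1) - overlap
    return overlap / union

def negative_span_weights(candidate_spans, gold_spans):
    """Weight each negative candidate by its best IoU with any gold entity."""
    return [max((span_iou(c, g) for g in gold_spans), default=0.0)
            for c in candidate_spans]

# Example: the gold entity covers tokens 2..4 ("New York City").
gold = [(2, 4)]
negatives = [(2, 3), (0, 1), (3, 6)]
print(negative_span_weights(negatives, gold))  # [0.666..., 0.0, 0.4]
```

Under this reading, a span that partially covers an entity (e.g. (2, 3) above) is weighted heavily, while a span disjoint from all entities contributes little, which matches the abstract's stated goal of concentrating on hard negative samples.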
