Abstract

Named Entity Recognition (NER) is an important task in knowledge extraction, which aims to extract structured information from unstructured text. To fully exploit the prior knowledge of pre-trained language models, some research works reformulate the NER task in machine reading comprehension form (MRC-form) to enhance the model's ability to generalize commonsense knowledge. However, this reformulation remains data-hungry when training data for a specific NER task is limited. To address the low-resource issue in NER, we introduce active multi-task-based NER (AMT-NER), a two-stage multi-task active learning model. Specifically, a multi-task learning module is first introduced into AMT-NER to improve its representation capability in low-resource NER tasks. Then, a two-stage training strategy is proposed to optimize AMT-NER's multi-task learning. An auxiliary Natural Language Inference (NLI) task is also employed to further enrich its commonsense knowledge. More importantly, AMT-NER introduces an active learning module, uncertainty-based selection, to actively filter training data and help the NER model learn efficiently. In addition, we find that different external supporting data affect model performance differently across pipelines in NER tasks. Extensive experiments demonstrate the superiority of our method and support our finding that introducing external knowledge is significant and effective for MRC-form NER tasks.
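To make the uncertainty-based selection idea concrete, the sketch below scores unlabeled sequences by mean per-token predictive entropy and picks the top-k most uncertain ones for annotation. This is a minimal illustration of entropy-based uncertainty sampling, not the paper's exact criterion; the function names, the entropy scoring, and the top-k selection rule are all assumptions for exposition.

```python
import numpy as np

def token_entropy(probs: np.ndarray) -> float:
    """Mean per-token predictive entropy for one sequence.

    probs: (seq_len, num_labels) softmax outputs from an NER model.
    Higher entropy means the model is less certain about its labels.
    """
    eps = 1e-12  # avoid log(0)
    ent = -(probs * np.log(probs + eps)).sum(axis=-1)  # (seq_len,)
    return float(ent.mean())

def select_uncertain(unlabeled_probs: list[np.ndarray], k: int) -> list[int]:
    """Return indices of the k sequences the model is least certain about."""
    scores = [token_entropy(p) for p in unlabeled_probs]
    return sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[:k]

# Toy usage: two 3-token sequences over 4 labels; the second is more uncertain.
confident = np.array([[0.97, 0.01, 0.01, 0.01]] * 3)
uncertain = np.array([[0.25, 0.25, 0.25, 0.25]] * 3)
print(select_uncertain([confident, uncertain], k=1))  # -> [1]
```

Selecting by uncertainty concentrates the annotation budget on examples the current model handles worst, which is what makes active learning attractive in the low-resource setting the abstract describes.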
