Abstract

The word embeddings used by traditional named entity recognition (NER) methods cannot represent the polysemy of a word, do not fully account for contextual information, and local features are easily overlooked during extraction. To address these problems, a named entity recognition method based on BERT is proposed and a BERT-BiLSTM-IDCNN-CRF model is built. BERT is used for pre-training, and the resulting word vectors are fed into a bidirectional long short-term memory network (BiLSTM) and an iterated dilated convolutional neural network (IDCNN) to extract features. The output features of the two networks are then combined, and finally the predictions are corrected by a conditional random field (CRF). Experimental results show that the model reaches an F1 score of 81.18% on the CLUENER dataset, 4.79% higher than the F1 score of the BiLSTM-CRF baseline model, which verifies the effectiveness of the method for the NER task.
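
As a rough illustration of the pipeline described above, the sketch below wires the four components together in PyTorch. The layer sizes, dilation rates, and the use of the `transformers` and `pytorch-crf` libraries are assumptions for illustration, not details taken from the paper.

```python
# Minimal sketch of a BERT-BiLSTM-IDCNN-CRF tagger (assumed hyperparameters).
import torch
import torch.nn as nn
from transformers import BertModel
from torchcrf import CRF


class BertBiLstmIdcnnCrf(nn.Module):
    def __init__(self, num_tags, bert_name="bert-base-chinese",
                 lstm_hidden=128, idcnn_filters=128, dilations=(1, 1, 2)):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)       # pre-trained encoder
        hidden = self.bert.config.hidden_size                  # 768 for BERT-base

        # BiLSTM branch: captures long-range context in both directions.
        self.bilstm = nn.LSTM(hidden, lstm_hidden, batch_first=True,
                              bidirectional=True)

        # IDCNN branch: stacked dilated 1-D convolutions capture local features.
        convs, in_ch = [], hidden
        for d in dilations:
            convs += [nn.Conv1d(in_ch, idcnn_filters, kernel_size=3,
                                padding=d, dilation=d), nn.ReLU()]
            in_ch = idcnn_filters
        self.idcnn = nn.Sequential(*convs)

        # Fuse the two branches, project to tag space, decode with a CRF.
        self.classifier = nn.Linear(2 * lstm_hidden + idcnn_filters, num_tags)
        self.crf = CRF(num_tags, batch_first=True)

    def forward(self, input_ids, attention_mask, tags=None):
        x = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        lstm_out, _ = self.bilstm(x)                            # (B, T, 2*lstm_hidden)
        cnn_out = self.idcnn(x.transpose(1, 2)).transpose(1, 2) # (B, T, idcnn_filters)
        emissions = self.classifier(torch.cat([lstm_out, cnn_out], dim=-1))
        mask = attention_mask.bool()
        if tags is not None:   # training: return negative log-likelihood as the loss
            return -self.crf(emissions, tags, mask=mask, reduction="mean")
        return self.crf.decode(emissions, mask=mask)            # inference: best tag paths
```

With padding and dilation set equal in each convolution, the IDCNN branch preserves the sequence length, so its output can be concatenated token-by-token with the BiLSTM output before the CRF layer.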
