Abstract

Texts describing educational emergencies contain a wealth of information. To address the problem that traditional named entity recognition methods in this domain cannot capture the polysemy of a word, this article proposes BERT+BiLSTM-CRF, a named entity recognition method for educational emergencies based on the BERT pre-trained language model. First, BERT is trained on an educational-emergency corpus to obtain vector representations of the words; a BiLSTM then encodes the contextual information of the serialized text; finally, a CRF decodes and labels the sequence to extract the corresponding entities. Experiments show that the BERT+BiLSTM-CRF fusion model achieves an accuracy of 91.62% on the educational-emergency dataset, a significant improvement over traditional named entity recognition models.
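
The pipeline described in the abstract (BERT embeddings, BiLSTM context encoding, CRF sequence decoding) can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the HuggingFace transformers package and the pytorch-crf package, and the model name, hidden size, and tag set are placeholders.

```python
import torch
import torch.nn as nn
from transformers import BertModel
from torchcrf import CRF  # from the pytorch-crf package (assumed dependency)


class BertBiLstmCrf(nn.Module):
    """Sketch of a BERT+BiLSTM-CRF tagger for named entity recognition."""

    def __init__(self, num_tags, bert_name="bert-base-chinese", lstm_hidden=256):
        super().__init__()
        # 1) BERT produces contextualized vector representations of the tokens.
        self.bert = BertModel.from_pretrained(bert_name)
        # 2) BiLSTM re-encodes the sequence to capture left and right context.
        self.bilstm = nn.LSTM(
            input_size=self.bert.config.hidden_size,
            hidden_size=lstm_hidden,
            batch_first=True,
            bidirectional=True,
        )
        # Project BiLSTM outputs to per-token tag scores (CRF emissions).
        self.emission = nn.Linear(2 * lstm_hidden, num_tags)
        # 3) CRF decodes the emissions into a globally consistent tag sequence.
        self.crf = CRF(num_tags, batch_first=True)

    def forward(self, input_ids, attention_mask, tags=None):
        hidden = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        lstm_out, _ = self.bilstm(hidden)
        emissions = self.emission(lstm_out)
        mask = attention_mask.bool()
        if tags is not None:
            # Training: negative log-likelihood of the gold tag sequence.
            return -self.crf(emissions, tags, mask=mask)
        # Inference: Viterbi-decoded best tag sequence for each sentence.
        return self.crf.decode(emissions, mask=mask)
```

In this sketch the loss is the CRF negative log-likelihood computed over the BiLSTM emissions, and decoding returns one tag sequence per sentence; the entity labels themselves (e.g. a BIO tag set for the educational-emergency entity types) are not specified in the abstract and would be defined by the dataset.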
