Abstract

The attention mechanism is an increasingly important approach in the field of natural language processing (NLP). In attention-based named entity recognition (NER) models, most attention mechanisms compute attention coefficients that express the importance of sentence semantic information, but they cannot adjust the position distribution of contextual feature vectors in the semantic space. To address this issue, a radial basis function attention (RBF-attention) layer is proposed to adaptively regulate the position distribution of sequence contextual feature vectors, minimizing the relative distance between within-category named entities and maximizing the relative distance between between-category named entities in the semantic space. Experimental results on the CoNLL2003 English and MSRA Chinese NER datasets indicate that the proposed model outperforms the baseline approaches without relying on any external feature engineering.
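The abstract does not specify the exact formulation of the RBF-attention layer, so the following is only a minimal sketch of the general idea: attention coefficients derived from a Gaussian RBF kernel between each contextual feature vector and a set of learnable centers, which re-positions the vectors in the semantic space. All names and shapes here (the `RBFAttention` module, `num_centers`, the learnable bandwidths) are assumptions for illustration, not the paper's implementation; in particular, the within-/between-category distance objective described above would require an additional training loss not shown here.

```python
import torch
import torch.nn as nn


class RBFAttention(nn.Module):
    """Hypothetical sketch of an RBF-attention layer (not the paper's exact design).

    Attention weights come from a Gaussian RBF kernel between each contextual
    feature vector and a set of learnable centers, so each vector's position
    in the semantic space is shifted toward nearby centers.
    """

    def __init__(self, hidden_dim: int, num_centers: int):
        super().__init__()
        # Learnable RBF centers in the semantic space (assumption).
        self.centers = nn.Parameter(torch.randn(num_centers, hidden_dim))
        # Learnable per-center log-bandwidths (assumption).
        self.log_sigma = nn.Parameter(torch.zeros(num_centers))

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (batch, seq_len, hidden_dim) contextual feature vectors,
        # e.g. the output of a BiLSTM or Transformer encoder.
        batch = h.size(0)
        centers = self.centers.unsqueeze(0).expand(batch, -1, -1)
        # Squared Euclidean distance to each center: (batch, seq_len, num_centers).
        dist2 = torch.cdist(h, centers) ** 2
        # Gaussian RBF responses; vectors closer to a center get larger weights.
        sigma2 = torch.exp(self.log_sigma) ** 2  # (num_centers,)
        rbf = torch.exp(-dist2 / (2.0 * sigma2))
        # Normalize over centers to obtain attention coefficients.
        attn = rbf / (rbf.sum(dim=-1, keepdim=True) + 1e-9)
        # Re-express each position as a mixture of centers, adaptively
        # regulating its position distribution in the semantic space.
        return attn @ self.centers  # (batch, seq_len, hidden_dim)


# Usage sketch: re-position encoder outputs before the tagging layer.
encoder_out = torch.randn(8, 32, 256)           # (batch, seq_len, hidden_dim)
layer = RBFAttention(hidden_dim=256, num_centers=16)
repositioned = layer(encoder_out)               # same shape as encoder_out
```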
