Abstract

Most traditional deep learning methods for Chinese character-relationship extraction use word2vec for word-vector embedding together with a single convolutional neural network (CNN) or recurrent neural network (RNN). However, word2vec cannot resolve polysemy, a CNN is ill-suited to sequence tasks, and an RNN cannot fully capture local features. To address these problems, we propose a character-relationship extraction method based on bidirectional encoder representations from transformers (BERT), CNN, and a bidirectional gated recurrent unit (BiGRU). First, we use BERT to produce dynamic, context-dependent word-vector representations. Second, we combine the CNN's strength at extracting local features with the BiGRU's ability to handle sequential tasks. We also introduce an attention mechanism (ATT) into the model. Experiments on the Soochow University & Setaria Company person-relationship extraction dataset show that the method achieves 81% accuracy and improves Recall and F1-score over single CNN and RNN models with word2vec embeddings.
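The abstract describes a pipeline of BERT embeddings, a CNN for local features, a BiGRU for sequence modeling, and an attention layer before classification. As an illustration of the attention-pooling step only, here is a minimal NumPy sketch of additive attention over BiGRU hidden states; the shapes, weight names, and scoring function are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def attention_pool(H, W, v):
    """Additive attention pooling over a sequence of hidden states.

    H : (T, d) matrix of BiGRU hidden states (one row per token).
    W : (d, a) projection matrix; v : (a,) scoring vector.
    Returns the attention weights (T,) and the pooled vector (d,).
    All shapes and parameters here are illustrative, not from the paper.
    """
    scores = np.tanh(H @ W) @ v                    # (T,) unnormalized scores
    scores = scores - scores.max()                 # stabilize the softmax
    alpha = np.exp(scores) / np.exp(scores).sum()  # attention weights
    return alpha, alpha @ H                        # weighted sum of states

# Toy sizes: 5 tokens, hidden dim 8, attention dim 4.
rng = np.random.default_rng(0)
H = rng.normal(size=(5, 8))
W = rng.normal(size=(8, 4))
v = rng.normal(size=4)
alpha, context = attention_pool(H, W, v)
```

The weighted sum `context` would then feed the final relation classifier; the weights `alpha` always sum to 1, so the pooled vector stays in the span of the hidden states.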
