Abstract

As an essential task in the construction of knowledge graphs, relation extraction (RE) has received extensive attention from researchers. Because existing RE methods adopt only a single trained word embedding to obtain sentence representations, they cannot handle polysemy well. To alleviate the polysemy problem in RE, this paper proposes a Two-channel model that adopts multiple trained word embeddings: one channel is a bidirectional long short-term memory network with an attention mechanism (Bi-LSTM-ATT), and the other channel is a convolutional neural network (CNN). Furthermore, a two-channel fusion method is proposed on top of this model to address polysemy in RE. The Two-channel model achieves F1-scores of 85.42% on the SemEval-2010 Task 8 dataset and 62.2% on the KBP37 dataset. The experimental results show that the Two-channel model outperforms most existing models that do not use external features generated by natural language processing (NLP) tools, and that the proposed fusion method performs better than either concatenation or addition of the two channel outputs.
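To make the described architecture concrete, the sketch below shows one plausible reading of a two-channel relation classifier: a Bi-LSTM-with-attention channel and a CNN channel, each fed by its own pretrained embedding table, with the two sentence vectors combined by a learned gate. The gate, the projection sizes, and the hyperparameters are assumptions made for illustration; the abstract does not specify the exact fusion formula or dimensions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwoChannelRE(nn.Module):
    """Illustrative two-channel relation classifier (not the paper's exact model):
    channel 1 = Bi-LSTM + word-level attention, channel 2 = 1-D CNN + max pooling,
    each on a separate pretrained embedding; a gated combination replaces plain
    concatenation or addition (the gate is an assumption for this sketch)."""

    def __init__(self, emb_a, emb_b, hidden=100, n_filters=100, proj=200, n_relations=19):
        super().__init__()
        # Two independent pretrained embedding matrices (e.g. different corpora/methods).
        self.emb_a = nn.Embedding.from_pretrained(emb_a, freeze=False)
        self.emb_b = nn.Embedding.from_pretrained(emb_b, freeze=False)

        # Channel 1: Bi-LSTM followed by word-level attention over hidden states.
        self.lstm = nn.LSTM(emb_a.size(1), hidden, bidirectional=True, batch_first=True)
        self.att = nn.Linear(2 * hidden, 1)
        self.proj_a = nn.Linear(2 * hidden, proj)

        # Channel 2: 1-D convolution over the second embedding, max-pooled over time.
        self.conv = nn.Conv1d(emb_b.size(1), n_filters, kernel_size=3, padding=1)
        self.proj_b = nn.Linear(n_filters, proj)

        # Gated fusion of the two projected sentence vectors, then classification.
        self.gate = nn.Linear(2 * proj, proj)
        self.classifier = nn.Linear(proj, n_relations)

    def forward(self, tokens):                            # tokens: (batch, seq_len)
        # Channel 1: contextual states, attention-weighted sum -> sentence vector.
        h, _ = self.lstm(self.emb_a(tokens))              # (batch, seq, 2*hidden)
        alpha = torch.softmax(self.att(h), dim=1)         # (batch, seq, 1)
        s1 = torch.tanh(self.proj_a((alpha * h).sum(dim=1)))

        # Channel 2: convolution over the embedded sequence, max over time.
        c = F.relu(self.conv(self.emb_b(tokens).transpose(1, 2)))  # (batch, filters, seq)
        s2 = torch.tanh(self.proj_b(c.max(dim=2).values))

        # Fusion: learned gate between the channels instead of concatenation/addition.
        g = torch.sigmoid(self.gate(torch.cat([s1, s2], dim=-1)))
        fused = g * s1 + (1 - g) * s2
        return self.classifier(fused)
```

With 19 output classes this sketch matches the SemEval-2010 Task 8 label set (18 directed relations plus Other); the gate lets the network weight each channel per sentence, which is one way such a fusion can improve on fixed concatenation or addition.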
