Abstract

Relation Extraction (RE) is one of the most important tasks in Natural Language Processing (NLP). In recent years, with the development of deep learning, a variety of deep neural networks, such as the Convolutional Neural Network (CNN), the Recurrent Neural Network (RNN), and the Long Short-Term Memory network (LSTM), have been applied to relation extraction and have made significant progress. LSTM in particular has become a mainstream model in NLP because it captures long-term dependencies better than CNN does. However, the ability of LSTM to capture long-term dependencies is still limited. To address this problem, we propose a phrase convolution structure. The structure extracts phrase-level features of a sentence; sentence-level features can then be extracted by feeding these phrase-level features into an LSTM. We believe this effectively enhances the ability of LSTM to capture long-term dependencies. Our experiments on the SemEval-2010 Task 8 dataset show that our model outperforms most existing models.
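The pipeline the abstract describes — a convolution over word windows producing phrase-level features, which an LSTM then aggregates into a sentence-level feature — can be sketched as below. This is a minimal NumPy illustration, not the paper's implementation: all dimensions, the tanh activation on the convolution output, and the single filter bank are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def phrase_conv(embeds, W, b):
    # embeds: (seq_len, d) word embeddings; W: (k*d, n_filt) filter bank.
    # Each output row is a phrase-level feature for one width-k window.
    k = W.shape[0] // embeds.shape[1]
    windows = [embeds[i:i + k].ravel() for i in range(len(embeds) - k + 1)]
    return np.tanh(np.stack(windows) @ W + b)

def lstm(seq, Wx, Wh, b):
    # Standard LSTM over the phrase-level features; the final hidden
    # state serves as the sentence-level feature vector.
    h_dim = Wh.shape[0]
    h, c = np.zeros(h_dim), np.zeros(h_dim)
    for x in seq:
        z = x @ Wx + h @ Wh + b
        i, f, o, g = np.split(z, 4)
        i, f, o, g = sigmoid(i), sigmoid(f), sigmoid(o), np.tanh(g)
        c = f * c + i * g          # update cell state
        h = o * np.tanh(c)         # update hidden state
    return h

# Toy sentence: 10 words, 8-dim embeddings, window k=3, 16 filters, 32-dim LSTM.
seq_len, d, k, n_filt, h_dim = 10, 8, 3, 16, 32
embeds = rng.standard_normal((seq_len, d))
conv_W = rng.standard_normal((k * d, n_filt)) * 0.1
conv_b = np.zeros(n_filt)
Wx = rng.standard_normal((n_filt, 4 * h_dim)) * 0.1
Wh = rng.standard_normal((h_dim, 4 * h_dim)) * 0.1
b = np.zeros(4 * h_dim)

phrases = phrase_conv(embeds, conv_W, conv_b)   # (8, 16): one row per phrase window
sentence = lstm(phrases, Wx, Wh, b)             # (32,): sentence-level feature
```

Because the LSTM consumes phrase-level features rather than individual words, each recurrent step already summarizes a local window, shortening the effective sequence the LSTM must remember — the intuition behind the claimed improvement in long-term dependency capture.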
