Abstract

Relation classification is an important task in natural language processing (NLP). State-of-the-art methods are mainly based on deep neural networks. This paper proposes a new convolutional neural network (CNN) architecture that combines syntactic tree structure with lexical-level features for relation classification. In our method, each word in the input sentence is first represented as a k-size word sequence that captures the context of the word under consideration. Each such word sequence is then parsed into a syntactic tree, and this tree structure is mapped into a real-valued vector. Finally, these features are concatenated with attention features for the words between the marked entities, and the combined representation is fed into a CNN for relation classification. We evaluate our method on the SemEval 2010 relation classification task, and experimental results show that it outperforms previous state-of-the-art methods when no external linguistic resources such as WordNet are used.
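
The following is a minimal sketch (not the authors' code) of the pipeline the abstract describes: building k-size context windows per word, concatenating them with hypothetical per-word tree-structure vectors and attention features, and scoring relation classes with a simple convolution, max-pooling, and softmax layer. All matrix names and shapes are illustrative assumptions.

```python
import numpy as np

def context_windows(word_vecs, k):
    """Return one (k * dim) vector per word: the word plus its surrounding context."""
    n, dim = word_vecs.shape
    half = k // 2
    padded = np.vstack([np.zeros((half, dim)), word_vecs, np.zeros((half, dim))])
    return np.stack([padded[i:i + k].reshape(-1) for i in range(n)])

def relation_scores(word_vecs, tree_feats, attn_feats, W_conv, W_out, k=3):
    """Score relation classes for one sentence (illustrative only).

    word_vecs : (n, d_w) word embeddings
    tree_feats: (n, d_t) hypothetical vectors for the per-word syntactic-tree structures
    attn_feats: (n, d_a) hypothetical attention features for words between the marked entities
    W_conv, W_out: assumed weight matrices of compatible shapes
    """
    # Concatenate context windows with the tree and attention features per position.
    x = np.concatenate([context_windows(word_vecs, k), tree_feats, attn_feats], axis=1)
    conv = np.tanh(x @ W_conv)        # position-wise convolution (window already folded into x)
    pooled = conv.max(axis=0)         # max-pooling over the sentence
    logits = pooled @ W_out
    e = np.exp(logits - logits.max())
    return e / e.sum()                # softmax over relation classes
```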
