Abstract

Relation classification is the task of identifying the relation between two entities in a sentence, and it is an essential step in the standard NLP pipeline. Most previous models make use of only dependency or semantic features, which may result in the loss of vital information. In this paper, we propose a novel model that incorporates both dependency and semantic information for relation classification. It is a neural network model built from long short-term memory (LSTM), graph convolutional networks (GCN), and convolutional neural networks (CNN), named LGCNN. Concretely, it uses self-attention and LSTM to capture the local context of a sentence. In addition, it uses a graph convolutional network to encode dependency information and a convolutional neural network to encode semantic information from the local context. Experiments on the SemEval-2010 Task 8 and KBP37 datasets demonstrate that our model is effective for relation classification.
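The abstract only names the components of LGCNN (self-attention, LSTM, GCN, and CNN branches over a shared local context), so the following is a minimal sketch of how such a forward pass might be wired up. All layer sizes, the way the branches are combined, the pooling, and the classifier head are assumptions made for illustration; they are not specified by the paper.

```python
# Hypothetical sketch of an LGCNN-style forward pass (not the authors' code).
import torch
import torch.nn as nn

class LGCNNSketch(nn.Module):
    def __init__(self, vocab_size, emb_dim=300, hidden=200, num_classes=19):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Self-attention + BiLSTM capture the local context of the sentence.
        self.attn = nn.MultiheadAttention(emb_dim, num_heads=4, batch_first=True)
        self.lstm = nn.LSTM(emb_dim, hidden, bidirectional=True, batch_first=True)
        # One GCN layer encodes dependency information over the parse-tree adjacency.
        self.gcn = nn.Linear(2 * hidden, hidden)
        # A CNN branch encodes semantic information from the same local context.
        self.cnn = nn.Conv1d(2 * hidden, hidden, kernel_size=3, padding=1)
        self.classifier = nn.Linear(2 * hidden, num_classes)

    def forward(self, tokens, adj):
        # tokens: (batch, seq_len) word ids
        # adj:    (batch, seq_len, seq_len) dependency adjacency with self-loops
        x = self.embed(tokens)
        x, _ = self.attn(x, x, x)            # self-attention over the sentence
        h, _ = self.lstm(x)                  # contextualised token states
        # GCN branch: average each token's dependency neighbours, then project.
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1)
        g = torch.relu(self.gcn(adj @ h / deg))
        # CNN branch over the same contextualised states.
        c = torch.relu(self.cnn(h.transpose(1, 2))).transpose(1, 2)
        # Max-pool each branch and concatenate for relation classification.
        feat = torch.cat([g.max(dim=1).values, c.max(dim=1).values], dim=-1)
        return self.classifier(feat)
```

The default `num_classes=19` reflects the SemEval-2010 Task 8 label set (nine directed relations plus "Other"); everything else in the sketch is a placeholder choice.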
