Abstract

Relation classification is an important semantic processing task in the field of natural language processing (NLP). In this paper, we present a novel model, BRCNN, to classify the relation between two entities in a sentence. Some state-of-the-art systems concentrate on modeling the shortest dependency path (SDP) between two entities, leveraging convolutional or recurrent neural networks. We further explore how to make full use of the dependency relation information in the SDP by combining convolutional neural networks and two-channel recurrent neural networks with long short-term memory (LSTM) units. We propose a bidirectional architecture that learns relation representations with directional information along the SDP forwards and backwards at the same time, which benefits classifying the direction of relations. Experimental results show that our method outperforms state-of-the-art approaches on the SemEval-2010 Task 8 dataset.

Highlights

  • Relation classification aims to classify the semantic relations between two entities in a sentence

  • Our first contribution is a recurrent convolutional neural network (RCNN) that encodes the global pattern of the shortest dependency path (SDP) with a two-channel long short-term memory (LSTM) based recurrent neural network, and captures local features of each pair of neighboring words linked by a dependency relation with a convolution layer

  • Our second contribution is a bidirectional recurrent convolutional neural network (BRCNN) that learns representations with bidirectional information along the SDP forwards and backwards at the same time, which strengthens the ability to classify the direction of relations between entities
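The local-feature half of the design in the first highlight can be sketched as a convolution over each (word, dependency relation, word) triple on the SDP, followed by max-pooling along the path. This is an illustrative sketch only: the dimensions, random embeddings, and the toy SDP below are made up for demonstration and do not reproduce the paper's trained model.

```python
import numpy as np

rng = np.random.default_rng(0)
d_w, d_r, d_c = 4, 3, 5   # word, relation, and conv-filter dims (illustrative)

# A toy SDP: words interleaved with the dependency relations that link them.
words = ["burst", "caused", "by", "pressure"]
rels = ["nsubjpass", "agent", "pobj"]

# Random stand-ins for learned embeddings.
word_vecs = {w: rng.standard_normal(d_w) for w in words}
rel_vecs = {r: rng.standard_normal(d_r) for r in rels}

# Convolution filter over a concatenated [word; relation; word] triple.
W_conv = rng.standard_normal((d_c, 2 * d_w + d_r))
b_conv = rng.standard_normal(d_c)

# One local feature per dependency-linked word pair, then max-pool over the path.
local = []
for i, r in enumerate(rels):
    triple = np.concatenate(
        [word_vecs[words[i]], rel_vecs[r], word_vecs[words[i + 1]]]
    )
    local.append(np.tanh(W_conv @ triple + b_conv))
path_feature = np.max(np.stack(local), axis=0)
print(path_feature.shape)  # (5,)
```

In the full model this pooled feature is combined with the outputs of the two-channel LSTM, which reads the word and relation sequences separately; only the convolutional part is shown here.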


Summary

Introduction

Relation classification aims to classify the semantic relations between two entities in a sentence. In the sentence "The [burst]e1 has been caused by water hammer [pressure]e2", the entities burst and pressure are of relation Cause-Effect(e2, e1). Deep learning techniques have brought significant improvements to relation classification. Recently, more attention has been paid to modeling the shortest dependency path (SDP) of sentences. Liu et al. (2015) developed a dependency-based neural network, in which a convolutional neural network captures features on the shortest path and a recursive neural network models subtrees. Xu et al. (2015b) applied long short-term memory (LSTM) based recurrent neural networks (RNNs) along the shortest dependency path. The SDP is a special structure in which every two neighboring words are linked by a dependency relation.
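The SDP between two entities can be recovered as the shortest path between them in the (undirected) dependency graph of the sentence. The sketch below illustrates this with a breadth-first search over a hand-written toy parse of the example sentence; the edge list is illustrative, not the output of any particular parser.

```python
from collections import deque

# Hand-written toy dependency parse of
# "The burst has been caused by water hammer pressure".
# Each edge is (head, dependent, relation); the parse is illustrative only.
edges = [
    ("caused", "burst", "nsubjpass"),
    ("caused", "has", "aux"),
    ("caused", "been", "auxpass"),
    ("caused", "by", "agent"),
    ("by", "pressure", "pobj"),
    ("pressure", "water", "compound"),
    ("pressure", "hammer", "compound"),
    ("burst", "The", "det"),
]

def shortest_dependency_path(edges, source, target):
    """BFS over the undirected dependency graph; returns the word
    sequence on the shortest path between the two entities."""
    graph = {}
    for head, dep, _ in edges:
        graph.setdefault(head, []).append(dep)
        graph.setdefault(dep, []).append(head)
    queue = deque([[source]])
    seen = {source}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(shortest_dependency_path(edges, "burst", "pressure"))
# ['burst', 'caused', 'by', 'pressure']
```

The resulting sequence, interleaved with the relations on its edges, is exactly the structure that the SDP-based models above consume.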

