Abstract

Relation Extraction (RE) and Entity Typing (ET) are two important tasks in the field of natural language processing. Existing methods usually handle RE and ET separately. However, the two tasks are strongly related: entity types are informative for inferring the relations between entities, and relations in turn provide useful cues for predicting entity types. Exploiting this relatedness has the potential to improve the performance of both tasks. In this paper, we propose a neural network based approach that jointly trains relation extraction and entity typing models within a multi-task learning framework. For relation extraction, we adopt a piecewise Convolutional Neural Network (PCNN) as the sentence encoder. For entity typing, since a sentence may contain multiple entities, we design a couple-attention model based on a Bidirectional Long Short-Term Memory (BiLSTM) network to obtain entity-specific sentence representations. In our multi-task learning framework, the two tasks share not only the low-level input embeddings but also the high-level task-specific semantic representations. Experimental results on benchmark datasets demonstrate that our approach effectively improves the performance of both relation extraction and entity typing.
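To make the described architecture concrete, the following is a minimal PyTorch sketch of a shared-embedding multi-task setup in the spirit of the abstract: a convolutional branch for relation extraction and a BiLSTM-with-attention branch for entity typing that builds an entity-specific sentence representation. All class names, dimensions, and label counts are illustrative assumptions, not taken from the paper; in particular, the piecewise max-pooling of the PCNN and the sharing of high-level task-specific representations are simplified here to keep the sketch short.

```python
# Minimal sketch (assumed names/dimensions): shared word embeddings feed
# a CNN branch for relation extraction and a BiLSTM + attention branch
# for entity typing.
import torch
import torch.nn as nn
import torch.nn.functional as F


class JointREET(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hidden=128,
                 num_relations=53, num_types=47):
        super().__init__()
        # Low-level word embeddings shared by both tasks.
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)

        # Relation-extraction branch: 1-D convolution over the sentence.
        # (Piecewise max-pooling over entity-delimited segments is replaced
        # by plain max-pooling for brevity.)
        self.conv = nn.Conv1d(emb_dim, hidden, kernel_size=3, padding=1)
        self.rel_out = nn.Linear(hidden, num_relations)

        # Entity-typing branch: BiLSTM encoder plus an attention query
        # derived from the target entity's embedding, so each entity gets
        # its own sentence representation.
        self.lstm = nn.LSTM(emb_dim, hidden // 2, batch_first=True,
                            bidirectional=True)
        self.attn = nn.Linear(emb_dim, hidden)
        self.type_out = nn.Linear(hidden, num_types)

    def forward(self, tokens, entity_pos):
        # tokens: (batch, seq_len) word ids; entity_pos: (batch,) index of
        # the target entity token for the typing task.
        emb = self.embed(tokens)                            # (B, T, E)

        # --- relation extraction ---
        conv = torch.relu(self.conv(emb.transpose(1, 2)))   # (B, H, T)
        sent_rep = conv.max(dim=2).values                   # (B, H)
        rel_logits = self.rel_out(sent_rep)

        # --- entity typing ---
        states, _ = self.lstm(emb)                          # (B, T, H)
        ent_emb = emb[torch.arange(emb.size(0)), entity_pos]  # (B, E)
        query = self.attn(ent_emb).unsqueeze(2)             # (B, H, 1)
        scores = torch.bmm(states, query).squeeze(2)        # (B, T)
        alpha = F.softmax(scores, dim=1).unsqueeze(1)       # (B, 1, T)
        ent_rep = torch.bmm(alpha, states).squeeze(1)       # (B, H)
        type_logits = self.type_out(ent_rep)

        return rel_logits, type_logits
```

In a joint training loop of this kind, the relation and typing losses (e.g. cross-entropy on `rel_logits` and `type_logits`) would typically be summed, possibly with a weighting factor, so that gradients from both tasks update the shared embedding layer.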
