Abstract

We tackle the problems of event and entity relation extraction jointly, and propose a novel method for joint extraction: iterative interactive learning. The method is motivated by the following empirical finding: extracted event attributes (e.g., trigger and event type) can serve as reliable features for recognizing entity relation types, and vice versa. Accordingly, on one hand we utilize the event attributes predicted by an event extraction system to remodel the distributed feature representations used for entity relation extraction, and on the other hand we use the entity relations recognized by a relation extraction system to remodel the features used for event extraction. This yields a double-channel, task-independent joint model with interactive learning: learning events to aid relation extraction, and learning relations to aid event extraction. In practice, we perform the interactive learning in an iterative manner, so as to boost the joint model progressively. Methodologically, we adopt bidirectional long short-term memory (Bi-LSTM) networks, equipped with the usual attention mechanism, to learn events and relations respectively. In our experiments, the Automatic Content Extraction (ACE) corpus is used to evaluate the proposed method. This corpus consists of event, entity and relation samples with gold-standard attribute tags. Experimental results show that our method outperforms the baselines (attention-based Bi-LSTMs without interactive learning) on both the event and relation extraction tasks, yielding F-score gains of about 1.6% and 1.8% respectively under a low-resource setting.
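The following is a minimal sketch of the double-channel interactive architecture described above, assuming a PyTorch implementation. All class names, tag-set sizes and dimensions (Channel, EVENT_TYPES, RELATION_TYPES, EMB_DIM, HID_DIM) are illustrative assumptions, not taken from the paper; the paper's actual feature remodelling may differ from the simple concatenation shown here.

```python
# Hypothetical sketch: two Bi-LSTM + attention channels that exchange their
# predicted label distributions over several interaction rounds.
import torch
import torch.nn as nn

EMB_DIM, HID_DIM = 100, 128
EVENT_TYPES, RELATION_TYPES = 34, 19   # hypothetical tag-set sizes


class Channel(nn.Module):
    """One Bi-LSTM + attention channel (used for either events or relations)."""
    def __init__(self, in_dim, n_labels):
        super().__init__()
        self.lstm = nn.LSTM(in_dim, HID_DIM, bidirectional=True, batch_first=True)
        self.attn = nn.Linear(2 * HID_DIM, 1)
        self.out = nn.Linear(2 * HID_DIM, n_labels)

    def forward(self, x):
        h, _ = self.lstm(x)                      # (B, T, 2H)
        a = torch.softmax(self.attn(h), dim=1)   # attention weights over tokens
        pooled = (a * h).sum(dim=1)              # (B, 2H) sentence representation
        return self.out(pooled)                  # label logits


class InteractiveJointModel(nn.Module):
    """Each channel consumes the other channel's predicted label distribution
    as extra features, i.e. the 'remodelled' representation of the abstract."""
    def __init__(self):
        super().__init__()
        self.event_ch = Channel(EMB_DIM + RELATION_TYPES, EVENT_TYPES)
        self.rel_ch = Channel(EMB_DIM + EVENT_TYPES, RELATION_TYPES)

    def forward(self, emb, n_rounds=2):
        B, T, _ = emb.shape
        ev_prob = emb.new_zeros(B, EVENT_TYPES)
        rel_prob = emb.new_zeros(B, RELATION_TYPES)
        for _ in range(n_rounds):                # iterative interactive learning
            ev_logits = self.event_ch(
                torch.cat([emb, rel_prob.unsqueeze(1).expand(B, T, -1)], dim=-1))
            rel_logits = self.rel_ch(
                torch.cat([emb, ev_prob.unsqueeze(1).expand(B, T, -1)], dim=-1))
            ev_prob = torch.softmax(ev_logits, dim=-1)
            rel_prob = torch.softmax(rel_logits, dim=-1)
        return ev_logits, rel_logits


# Usage: word embeddings for a batch of 4 sentences, 20 tokens each.
model = InteractiveJointModel()
ev, rel = model(torch.randn(4, 20, EMB_DIM))
print(ev.shape, rel.shape)  # torch.Size([4, 34]) torch.Size([4, 19])
```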
