Abstract

Event Causality Extraction (ECE) plays an essential role in many Natural Language Processing (NLP) tasks, such as event prediction and dialogue generation. Recent work treats ECE as a sequence labeling problem. However, these methods tend to extract events and their causality with a single collapsed model that focuses on textual content while ignoring intra-event element transitions and the causality transition associations across events. In general, ECE should capture both the complex intra-event relationships and the causality transition associations among events. We therefore propose a novel dual-channel enhanced neural network that addresses this limitation by taking both global event mentions and causality transition associations into account. To extract complete event mentions, a Textual Enhancement Channel (TEC) learns important intra-event features from the training data with a wider perceptual field. A Knowledge Enhancement Channel (KEC) then incorporates external causality transition knowledge through a Graph Convolutional Network (GCN) to provide complementary information on event causality. Finally, a dynamic fusion attention mechanism weighs the importance of the two channels, so the model can combine semantic-level and knowledge-level event representations when extracting event causality. Experimental results on three public datasets show that our model outperforms state-of-the-art methods.
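To make the fusion step concrete, the sketch below illustrates one plausible form of a dynamic fusion attention over the two channel outputs: a per-token scalar score for each channel, normalized with a softmax and used to weight the TEC and KEC representations. This is a minimal illustration under assumed tensor shapes; the module, dimensions, and variable names (e.g. `DynamicFusionAttention`, `hidden_dim`) are hypothetical and not taken from the authors' implementation.

```python
# Hypothetical sketch of a dynamic fusion attention over two channels.
# Names and shapes are illustrative assumptions, not the paper's code.
import torch
import torch.nn as nn


class DynamicFusionAttention(nn.Module):
    """Per-token weighting of the textual (TEC) and knowledge (KEC) channels."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        # One scalar score per channel, computed from each token representation.
        self.score_tec = nn.Linear(hidden_dim, 1)
        self.score_kec = nn.Linear(hidden_dim, 1)

    def forward(self, h_tec: torch.Tensor, h_kec: torch.Tensor) -> torch.Tensor:
        # h_tec, h_kec: (batch, seq_len, hidden_dim)
        scores = torch.cat([self.score_tec(h_tec), self.score_kec(h_kec)], dim=-1)
        weights = torch.softmax(scores, dim=-1)            # (batch, seq_len, 2)
        fused = weights[..., 0:1] * h_tec + weights[..., 1:2] * h_kec
        return fused                                        # (batch, seq_len, hidden_dim)


if __name__ == "__main__":
    fusion = DynamicFusionAttention(hidden_dim=128)
    tec_out = torch.randn(2, 16, 128)   # stand-in for TEC token features
    kec_out = torch.randn(2, 16, 128)   # stand-in for GCN-based KEC features
    print(fusion(tec_out, kec_out).shape)  # torch.Size([2, 16, 128])
```

The fused representation would then feed the sequence labeling layer that tags event spans and their causal roles.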
