Abstract

Existing event extraction methods identify and classify each argument role independently, ignoring the interdependence between different argument roles, and they rely on simple vectors to represent word embeddings. We address these shortcomings by embedding explicit syntactic constraints in the attention mechanism, using dependency syntax to guide the text modeling. Specifically, we use dependency syntax to guide a BERT model for Chinese event argument role extraction in three stages. First, a self-attention method guided by a dependency syntactic parse tree is embedded in the Transformer computing framework of the BERT model; in addition to obtaining a deep bidirectional representation of a word from its context, this method also captures long-distance syntactic dependency relationships between words. Second, a conditional layer normalization method is applied in the argument extraction model to integrate the semantic information of trigger words into the text representation, improving the accuracy of argument role extraction. Finally, a conditional random field (CRF) determines the optimal sequence of labels at the sentence level based on the dependencies between adjacent labels. Experimental results show that the proposed model outperforms several strong baselines on the Chinese event argument extraction task on the ACE2005 and iFLYTEKA.I.2020 datasets.
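The first stage described above, self-attention constrained by a dependency parse tree, can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the function names, the toy edge list, and the choice to restrict attention to tree-adjacent tokens (plus self-loops) are all assumptions for demonstration.

```python
import numpy as np

def dependency_mask(n_tokens, edges):
    """Boolean mask allowing token i to attend to token j only if they
    are adjacent in the dependency tree (self-attention always allowed).
    `edges` is a list of (head, dependent) index pairs -- a toy parse here,
    not output from a real parser."""
    mask = np.eye(n_tokens, dtype=bool)
    for head, dep in edges:
        mask[head, dep] = mask[dep, head] = True
    return mask

def syntax_guided_attention(Q, K, V, mask):
    """Scaled dot-product attention in which positions outside the
    dependency mask are suppressed before the softmax."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    scores = np.where(mask, scores, -1e9)  # block non-dependency positions
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

# Toy example: 4 tokens, edges from a hypothetical dependency parse.
edges = [(1, 0), (1, 2), (2, 3)]
mask = dependency_mask(4, edges)
rng = np.random.default_rng(0)
Q = K = V = rng.standard_normal((4, 8))
out = syntax_guided_attention(Q, K, V, mask)
```

In a full model this mask (or a soft variant of it) would be applied inside each Transformer layer of BERT, so that attention weights respect long-distance syntactic dependencies rather than raw token proximity alone.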
