Abstract Event extraction takes the events described in a text as its subject of interest and extracts structured event information according to predefined event types and templates. Existing event extraction models mainly target the general domain, ignoring domain-specific prior knowledge and the dependency information between entities, and they do not address the problems of scattered event arguments and multiple events in one document. To address these issues, this paper proposes a model based on FinBERT (Financial Bidirectional Encoder Representations from Transformers) and RAAT (Relation-Augmented Attention Transformer). The model uses a structured self-attention mechanism to extract the dependencies between entities, uses RAAT to fuse this dependency information into the sentence encoding, and finally uses binary classification to identify event types and generate event records. Compared with baseline methods, the F1 score on the event extraction task improves by 2.5% and 2.8% on the ChFinAnn and DuEE-fin datasets, respectively.
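As a rough illustration of the dependency-scoring idea the abstract mentions, the sketch below computes scaled dot-product attention weights between entity vectors, giving a pairwise "dependency" matrix over entities. This is a generic stand-in under assumed inputs, not the paper's exact structured self-attention mechanism; the function name and toy vectors are hypothetical.

```python
import math

def entity_dependency_scores(entities):
    """Illustrative sketch: row-normalised scaled dot-product scores
    between entity vectors, standing in for a structured self-attention
    pass over entity pairs (not the paper's actual implementation)."""
    d = len(entities[0])  # hidden dimension of each entity vector

    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    weights = []
    for q in entities:
        # scaled dot-product score of this entity against every entity
        scores = [dot(q, k) / math.sqrt(d) for k in entities]
        # numerically stable softmax over the row
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        weights.append([e / z for e in exps])
    return weights

# toy usage: three 2-dimensional entity vectors
deps = entity_dependency_scores([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
print(len(deps), len(deps[0]))  # 3 3
```

In a full pipeline, the entity vectors would come from the FinBERT encoder, and a matrix like `deps` would be what RAAT fuses back into the sentence encoding.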