Script event prediction aims to infer subsequent events given an incomplete script. It requires a deep understanding of events and can support a variety of downstream tasks. Existing models rarely consider relational knowledge between events: they regard scripts as sequences or graphs, which cannot jointly capture the relational information between events and the semantic information of script sequences. To address this issue, we propose a new script form, the relational event chain, which combines event chains and relational graphs. We also introduce a new model, the relational-transformer, to learn event embeddings based on this form. Specifically, we first extract relationships between events from an event knowledge graph to formalize scripts as relational event chains, and then use the relational-transformer to calculate the likelihood of candidate events; the model learns event embeddings that encode both semantic and relational knowledge by combining transformers with graph neural networks (GNNs). Experimental results on both one-step and multi-step inference tasks show that our model outperforms existing baselines, indicating the validity of encoding relational knowledge into event embeddings. We also analyze the influence of different model structures and different types of relational knowledge.