Abstract

Story ending generation is a challenging and under-explored task that aims to generate a coherent, reasonable, and logical ending for a given story context. Previous studies mainly focus on utilizing contextual information and commonsense knowledge to generate story endings. However, several issues remain in the story ending generation process, such as sentiment consistency and interference from secondary information. In this paper, we propose a Gated Mechanism based Transformer Network (GMTF). The GMTF model exploits the sentiment trend to make the generated ending more sentimentally consistent with the context. For a given story context, we use the sentiment analysis tool VADER to obtain the sentiment trend. The sentiment information and contextual information are then fed jointly into the transformer network to capture the key clues. Furthermore, a gated mechanism is applied to filter out irrelevant information, and the attention-layer weights are shared between the encoder and decoder to make the most of the contextual clues. Experimental results on the ROCStories dataset show that the proposed method achieves 27.03% on BLEU-1, 7.62% on BLEU-2, 1.71 on Grammar, and 1.31 on Logicality. Specifically, our model outperforms the state-of-the-art IE+MSA model by 0.23%, 0.22%, 1.78%, and 5.64% on these metrics, respectively, and the Transformer model by 3.06%, 1.05%, 5.55%, and 48.86%, respectively. Both automatic and manual evaluations show that our model generates more reasonable and appropriate story endings than related well-established approaches.
