Abstract

Obtaining the key semantic information in sentences is a challenging task that directly affects the accuracy of the generated summary. A summarization model based on a localization attention mechanism is designed to extract key semantic information. It consists of three parts: a sequence-to-sequence network built on location attention and a flashback structure, a selection-gate encoder network, and a competition mechanism. First, keywords are used to increase the weights of words in key sentences through probability superposition of important information. Second, the attention mechanism assigns a weight to every word in a sentence in order to locate the key semantic information, which the decoder uses to generate a summary. In parallel, the key sentences are fed to the selection-gate encoder network; the semantic allocation probabilities of the words in the sentences are computed from this key information and then decoded to produce a second version of the summary. Finally, a competition mechanism is designed to optimize the summaries generated by the two networks by computing the cosine distance between each candidate summary and the key sentences. Experimental results show ROUGE-1, ROUGE-2, and ROUGE-F scores of 38.17%, 22.24%, and 34.97%, respectively, indicating that the proposed model outperforms existing models.
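
The competition mechanism is only outlined above; the sketch below illustrates the cosine-distance comparison it describes, assuming each candidate summary and the key sentences have already been encoded as fixed-length vectors (for example, averaged word embeddings). The rule of keeping the candidate closer to the key sentences is an assumption, since the abstract only states that the cosine distance is computed.

```python
import numpy as np


def cosine_distance(u: np.ndarray, v: np.ndarray) -> float:
    """Return 1 - cosine similarity between two sentence vectors."""
    denom = np.linalg.norm(u) * np.linalg.norm(v)
    if denom == 0.0:
        return 1.0
    return 1.0 - float(np.dot(u, v) / denom)


def compete(summary_a: np.ndarray, summary_b: np.ndarray,
            key_sentences: np.ndarray) -> tuple[str, float]:
    """Compare two candidate summary vectors against the key-sentence
    vector and keep the one with the smaller cosine distance
    (hypothetical selection rule, not confirmed by the abstract)."""
    d_a = cosine_distance(summary_a, key_sentences)
    d_b = cosine_distance(summary_b, key_sentences)
    return ("seq2seq", d_a) if d_a <= d_b else ("selection-gate", d_b)


# Usage with toy 4-dimensional sentence vectors:
if __name__ == "__main__":
    a = np.array([0.2, 0.8, 0.1, 0.3])   # summary from the seq2seq network
    b = np.array([0.6, 0.1, 0.7, 0.2])   # summary from the selection-gate network
    k = np.array([0.3, 0.7, 0.2, 0.3])   # key-sentence representation
    print(compete(a, b, k))
```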
