Abstract

Text summarization has attracted wide attention in news, document indexing, and literature retrieval. Recently, with the rise of mobile Internet devices, natural language processing for AIoT and edge computing has become a research hotspot, and this paper focuses on text summarization in that setting. For a long time, abstractive summarization was confined to academic research because the generated content was hard to control. The advent of the Transformer has changed this situation. The Transformer follows an encoder‐decoder architecture built from attention mechanisms and feed‐forward networks: the encoder encodes the semantic information of the source text, and the decoder adaptively selects effective context information through attention to generate a coherent summary. To extract richer semantic information and better control the generated text, this paper proposes the multi‐scale semantic information Transformer (MSIT). Specifically, we introduce depth‐wise separable convolution into the encoder to extract more local semantic information, so that the attention mechanism can make better use of contextual semantics. Additionally, we combine the encoder's output vector with the target summary as input to the decoder's attention layer, and introduce a time‐series mechanism so that the decoder can consider context information while generating text. Experiments on the CNN/Daily Mail dataset show that the model outperforms competing methods.
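The depth‐wise separable convolution mentioned above factors an ordinary convolution into a per‐channel (depth‐wise) filter followed by a 1×1 point‐wise mixing step, which cuts parameters while still capturing local patterns. The sketch below is a minimal NumPy illustration of that factorization for 1‐D sequences; the function and parameter names are hypothetical and are not taken from the paper's implementation.

```python
import numpy as np

def depthwise_separable_conv1d(x, depth_kernels, point_weights):
    """Hypothetical sketch of depth-wise separable 1-D convolution.

    x             : (channels, length) input sequence features
    depth_kernels : (channels, k) one odd-length kernel per channel
    point_weights : (out_channels, channels) 1x1 point-wise mixing matrix
    """
    c, n = x.shape
    k = depth_kernels.shape[1]
    pad = k // 2  # "same" padding, assuming odd k
    xp = np.pad(x, ((0, 0), (pad, pad)))
    # Depth-wise step: each channel is convolved only with its own kernel.
    # Kernel is reversed so np.convolve performs cross-correlation,
    # matching the convention used in deep-learning conv layers.
    depth_out = np.stack([
        np.convolve(xp[i], depth_kernels[i][::-1], mode="valid")
        for i in range(c)
    ])
    # Point-wise step: a 1x1 convolution that mixes information across channels.
    return point_weights @ depth_out
```

A standard convolution over `c` input and `o` output channels needs `o * c * k` weights, whereas this factorization needs only `c * k + o * c`, which is the usual motivation for using it in lightweight encoders.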
