Abstract. Text summarization is a core research topic in natural language processing and is widely applied in domains such as journalism, library administration, and information retrieval. The development of deep learning, and in particular the Transformer architecture, has greatly advanced text summarization. This paper reviews recent progress in Transformer-based text summarization methods. It begins with an overview of traditional summarization techniques, then examines the advantages of Transformer models for summarization, such as their ability to capture global context, dynamically allocate attention weights, and exploit parallel computation. Summarization models are classified into several types, including extraction-based, abstraction-based, and those leveraging large language models; models such as PEGASUS, BERT, and HETFORMER have emerged as leading examples in this field. The effectiveness, advantages, and disadvantages of these models are also analyzed.
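As a minimal illustration of the abstraction-based approach mentioned above, the sketch below runs a pretrained PEGASUS checkpoint through the Hugging Face transformers library; the specific checkpoint name (google/pegasus-xsum), the sample input, and the generation settings are illustrative assumptions and are not drawn from the reviewed paper.

# Minimal sketch (assumption): abstractive summarization with a pretrained
# PEGASUS checkpoint via the Hugging Face transformers library.
# The checkpoint name, sample text, and generation settings are illustrative
# choices, not taken from the paper itself.
from transformers import PegasusTokenizer, PegasusForConditionalGeneration

model_name = "google/pegasus-xsum"  # assumed checkpoint for demonstration
tokenizer = PegasusTokenizer.from_pretrained(model_name)
model = PegasusForConditionalGeneration.from_pretrained(model_name)

document = (
    "Transformer-based models have greatly advanced text summarization by "
    "capturing global context and allowing highly parallel training."
)

# Encode the source document and generate an abstractive summary.
inputs = tokenizer(document, truncation=True, return_tensors="pt")
summary_ids = model.generate(**inputs, max_length=60, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))

Here beam search is used only as a common default for sequence generation; extraction-based models would instead select and rank sentences from the source document rather than generate new text.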