Recurrent neural networks (RNNs) have been widely investigated for automatic abstractive summarization and have achieved good results in previous studies. However, when processing and storing information, the RNN structure may lose long-term information, making it unable to generate high-quality summaries that comprehensively cover the corresponding document. In this paper, to overcome this problem and to enhance global information, we propose a memory-enhanced abstractive summarization (MEAS) model consisting of a memory enhancement module and a Seq2Seq module. Our model captures and stores global information about the entire document, such as inter-sentence relationships, providing a richer information representation to the Seq2Seq module and enabling it to generate higher-quality summaries. Experimental results on the CNN/DailyMail corpus show that our MEAS model achieves improvements of up to 1.17, 0.27, and 0.85 in R-1, R-2, and R-L scores, respectively, over the related state-of-the-art baseline.