Abstract
The ever-increasing volume of research literature makes it difficult for researchers to keep up with related work in their fields. Automating the generation of the related work section holds promise for saving time and effort. However, current models often fail to produce coherent and logically correct multi-sentence related work, a phenomenon we refer to as rhetorical structure chaos. Rhetorical structure describes how adjacent text units are connected to one another, and a logically correct rhetorical structure is essential for a well-organized related work section. To tackle the rhetorical structure chaos issue, this paper explicitly incorporates rhetorical structure information into related work generation. First, we conduct the first rhetorical structure analysis of related work sections, which provides insights into how their contents are organized and arranged. Then, building on two preliminary studies of rhetorical structure, we present a novel related work generation model called RSGen, which incorporates rhetorical structure at both the encoding and decoding stages. The encoding stage is facilitated by a rhetorical structure-based graph encoder, while the decoding process is guided by a rhetorical plan, i.e., an ordered sequence of the rhetorical functions of the related work. We conduct extensive experiments on three related work generation datasets to evaluate the performance of our model. The results show that our approach achieves state-of-the-art performance on ROUGE metrics. An ablation study and further analyses highlight the efficacy of introducing rhetorical structure into both the encoding and decoding stages.