Abstract
With the tremendous growth in the number of scientific papers, researchers must spend considerable time and effort to find the papers they are looking for. Local citation recommendation, which provides a list of references for a given text segment, can alleviate this problem. Most existing local citation recommendation approaches concentrate on narrowing the semantic gap between the text content of scientific papers and that of citation contexts, neglecting other information. Inspired by the successful use of the encoder-decoder framework in machine translation, we develop an attention-based encoder-decoder (AED) model for local citation recommendation. The proposed AED model integrates venue and author information into the attention mechanism and learns relations between the variable-length texts of the two text objects, i.e., citation contexts and scientific papers. Specifically, we first construct an encoder that represents a citation context as a vector in a low-dimensional space; we then construct an attention mechanism that integrates venue and author information and use an RNN to construct a decoder; next, we map the decoder's output into a softmax layer and score the scientific papers. Finally, we select the papers with the highest scores to generate a recommended reference list. We conduct experiments on the DBLP and ACL Anthology Network (AAN) datasets, and the results show that the proposed approach outperforms three state-of-the-art approaches.
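To make the described pipeline concrete, below is a minimal PyTorch-style sketch of an encoder, an attention mechanism conditioned on venue and author embeddings, an RNN decoder, and a softmax scoring layer. This is an illustrative reconstruction only: the class name `AEDRecommender`, the choice of GRUs for the RNNs, the way the side information enters the attention scores, and all dimensions and hyperparameters are assumptions, not details reported in the paper.

```python
import torch
import torch.nn as nn

class AEDRecommender(nn.Module):
    """Sketch of an attention-based encoder-decoder recommender.

    All hyperparameters (embed_dim, hidden_dim) and the GRU choice
    are illustrative assumptions, not values from the paper.
    """

    def __init__(self, vocab_size, n_venues, n_authors, n_papers,
                 embed_dim=128, hidden_dim=256):
        super().__init__()
        self.word_embed = nn.Embedding(vocab_size, embed_dim)
        # Encoder: maps the citation context into low-dimensional states.
        self.encoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        # Venue and author embeddings folded into the attention scores.
        self.venue_embed = nn.Embedding(n_venues, hidden_dim)
        self.author_embed = nn.Embedding(n_authors, hidden_dim)
        self.attn = nn.Linear(hidden_dim * 2, 1)
        # Decoder: an RNN over the attended representation.
        self.decoder = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        # Softmax layer scoring every candidate paper.
        self.score = nn.Linear(hidden_dim, n_papers)

    def forward(self, context_ids, venue_id, author_id):
        # context_ids: (batch, seq_len); venue_id, author_id: (batch,)
        states, _ = self.encoder(self.word_embed(context_ids))
        # Side information: combined venue and author embeddings,
        # broadcast across every encoder time step.
        side = self.venue_embed(venue_id) + self.author_embed(author_id)
        side = side.unsqueeze(1).expand_as(states)
        # Attention weights conditioned on text states and side info.
        weights = torch.softmax(
            self.attn(torch.cat([states, side], dim=-1)).squeeze(-1),
            dim=1)
        attended = (weights.unsqueeze(-1) * states).sum(dim=1, keepdim=True)
        out, _ = self.decoder(attended)
        # Log-probabilities over all papers; the top-k scoring papers
        # form the recommended reference list.
        return torch.log_softmax(self.score(out.squeeze(1)), dim=-1)
```

In this sketch, recommendation amounts to ranking: calling the model on a batch of citation contexts yields a score per candidate paper, and `scores.topk(k)` selects the k highest-scoring papers as the reference list.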