Abstract

Citation intent extraction and classification have long been studied because citation intent is a good measure of relevance. Different approaches have classified citations into different classes, including weak and strong, positive and negative, and important and unimportant. Others have gone beyond binary classification to multi-class schemes, including extension, use, background, and comparison. Researchers have utilized various elements of a publication, including both its metadata and its content. The actual intent behind citing an article lies within the citation context, the text surrounding the place where the paper is referred to. Various attempts have been made to study the citation context to capture the citation intent, but very few have encoded the words into contextual representations. For automated classification, we need to train deep learning models that take the citation context as input and predict the reason for citing a paper. Deep neural models operate on numeric data, and therefore the text must be converted into a numeric representation. Natural languages are far more complex than computer languages. Computer languages have a pre-defined, fixed syntax in which each word has a unique meaning. In contrast, a word in natural language may carry different meanings and can only be understood through its position, the preceding discussion, and its neighboring words. This extra information provides the context of a word within a sentence. We have therefore used contextual word representations, which are trained through deep neural networks. Deep models require massive amounts of data to generalize; however, the existing state-of-the-art datasets do not provide enough data for models to generalize well. Therefore, we have developed our own scholarly dataset, Citation Context Dataset with Intent (C2D-I), an extension of the C2D dataset. We used a transformer-based model to capture the contextual representations of words. Our proposed method outperformed the existing benchmark methods with an F1 score of 89%.
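To make the described pipeline concrete, the sketch below shows how a citation context could be fed to a transformer encoder with a classification head over intent labels, using the Hugging Face transformers library. It is a minimal illustration, not the paper's exact setup: the checkpoint ("bert-base-uncased"), the four-label set, and the example sentence are assumptions for demonstration only.

```python
# Minimal sketch: classifying a citation context with a transformer encoder.
# The checkpoint, label set, and example context are illustrative assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

labels = ["background", "use", "extension", "comparison"]  # example intent classes

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=len(labels)
)

# A citation context: the sentence(s) surrounding the in-text reference.
context = "We adopt the attention mechanism of [CIT] to align source and target tokens."
batch = tokenizer(context, truncation=True, return_tensors="pt")

# During fine-tuning, a cross-entropy loss over the intent labels would be
# minimized; here we only run inference with the (untrained) classification head.
with torch.no_grad():
    logits = model(**batch).logits
print(labels[logits.argmax(dim=-1).item()])
```

In practice, the classification head would be fine-tuned on labeled citation contexts so that the contextual token representations are adapted to the intent labels before prediction.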
