Abstract

In the era of Web 2.0, people have become accustomed to expressing their attitudes and exchanging opinions on social media sites such as Twitter. It is critical for security- and business-related applications to make sense of the public opinions implied in users' texts. Stance detection aims to classify the stance a user holds towards a given target as FAVOR, AGAINST, or NONE. In the literature, much effort has been devoted to neural-network-based stance detection to avoid hand-crafted features. As a widely used neural network structure, the convolutional neural network (CNN) can mine and combine various local textual features for stance classification with high training efficiency. However, global textual information is usually neglected in the convolution process. Moreover, stance clues are often mixed with less informative words in noisy tweets, and it is hard for a CNN to resolve and leverage long-distance semantic dependencies between words effectively. To address these issues, in this paper we propose CCNN-ASA, the Condensed CNN by Attention over Self-Attention, to detect the stances of tweets. We first introduce self-attention into the CNN to adaptively enhance word embeddings with global textual information. To bring stance clues closer together and thus make them more salient, we further introduce an attention-based condensation module that identifies stance-indicative words to condense tweets. Experiments on a benchmark dataset show that CCNN-ASA outperforms state-of-the-art methods in stance detection of tweets.
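The two ideas named in the abstract — enhancing word embeddings with global context via self-attention, and condensing a tweet down to its most stance-indicative words — can be sketched as follows. This is a minimal illustration in NumPy, not the paper's actual model: the attention here is untrained dot-product attention, the stance query vector `q` and the keep-count `k` are hypothetical stand-ins for learned parameters, and the downstream CNN classifier is omitted.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention_enhance(E):
    """Mix global textual information into each word embedding via
    scaled dot-product self-attention (untrained; the real model
    would learn query/key/value projections)."""
    d = E.shape[-1]
    scores = softmax(E @ E.T / np.sqrt(d))  # (n, n) word-to-word weights
    return E + scores @ E                   # residual: local + global info

def condense(E, q, k):
    """Keep the k words most attended by a stance query vector q,
    preserving their original order, so stance clues end up adjacent
    for the subsequent convolution (hypothetical condensation step)."""
    w = softmax(E @ q)                      # stance-indicativeness per word
    keep = np.sort(np.argsort(w)[-k:])      # top-k indices, original order
    return E[keep]

rng = np.random.default_rng(0)
E = rng.standard_normal((7, 16))            # a 7-word tweet, 16-dim embeddings
H = self_attention_enhance(E)               # same shape, globally enhanced
C = condense(H, rng.standard_normal(16), k=4)
print(H.shape, C.shape)                     # (7, 16) (4, 16)
```

The condensation step shortens the sequence fed to the CNN, so filters of a fixed width can span stance clues that were originally far apart in the noisy tweet.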
