Abstract

Sarcasm is a statement that conveys an opposing viewpoint via positive or exaggeratedly positive phrases. Due to this intentional ambiguity, sarcasm identification has become an important factor in sentiment analysis, prompting many researchers in natural language processing to study sarcasm detection intensively. This research uses a multiple-channels embedding attention bidirectional long short-term memory (MCEA-BLSTM) model to explore sarcasm detection in news headlines, taking a different approach from previous research, which developed models based on lexical, semantic, and pragmatic properties. This research found that the multiple-channels embedding and attention mechanism improve the performance of the BLSTM, making it superior to other models. The proposed method achieves 96.64% accuracy with an F-measure of 97%.
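
For orientation, the sketch below shows what a multiple-channels embedding, attention BiLSTM classifier of this general shape might look like. The channel count, embedding and hidden sizes, the additive attention form, and all identifiers are assumptions for illustration, not details taken from the paper.

```python
# Hypothetical sketch: multi-channel embedding + attention + BiLSTM classifier.
# Hyperparameters and the attention formulation are assumed, not from the paper.
import torch
import torch.nn as nn

class MCEABLSTM(nn.Module):
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128, num_channels=3):
        super().__init__()
        # Multiple embedding channels, e.g. one trainable and others initialised
        # from different pretrained vectors; here all are plain trainable tables.
        self.channels = nn.ModuleList(
            [nn.Embedding(vocab_size, embed_dim) for _ in range(num_channels)]
        )
        self.bilstm = nn.LSTM(
            embed_dim * num_channels, hidden_dim,
            batch_first=True, bidirectional=True,
        )
        # Additive attention over the BiLSTM outputs.
        self.attn = nn.Linear(2 * hidden_dim, 1)
        self.classifier = nn.Linear(2 * hidden_dim, 1)  # sarcastic vs. not

    def forward(self, token_ids):                        # (batch, seq_len)
        # Concatenate the embedding channels along the feature axis.
        x = torch.cat([emb(token_ids) for emb in self.channels], dim=-1)
        h, _ = self.bilstm(x)                             # (batch, seq_len, 2*hidden)
        weights = torch.softmax(self.attn(h), dim=1)      # attention over time steps
        context = (weights * h).sum(dim=1)                # attention-weighted summary
        return self.classifier(context).squeeze(-1)       # logits

# Usage: binary cross-entropy over headline token ids (dummy data shown).
model = MCEABLSTM(vocab_size=20000)
logits = model(torch.randint(0, 20000, (8, 30)))          # batch of 8 headlines
loss = nn.BCEWithLogitsLoss()(logits, torch.zeros(8))
```

The attention layer here simply scores each BiLSTM time step and takes a softmax-weighted sum; the paper's exact mechanism may differ.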
