Abstract

Due to the complex semantics of natural language, the multiple sentiment polarities a word can carry, and the long-range sentiment dependencies between words, existing sentiment analysis methods (especially for Chinese text) still face severe challenges. To address these issues, this paper proposes a scalable multi-channel dilated CNN–BiLSTM (convolutional neural network and bidirectional long short-term memory) model with an attention mechanism to analyze the sentiment tendency of Chinese texts. Through the multi-channel structure, the model extracts both the original context features and multiscale high-level context features. Importantly, the number of channels can be expanded to best suit the target corpus. Furthermore, an attention mechanism comprising both local and global attention is adopted to further discriminate among features: the former weights the output features of each channel, while the latter weights the fused features of all channels. In addition, an adaptive weighted loss function is designed to mitigate class imbalance in the training data. Finally, several experiments demonstrate the superior performance of the proposed model on two public datasets. Compared with word-level methods, accuracy and Macro-F1 improve by over 1.19% and 0.9%, respectively, on the NLPCC2017-ECGC corpus, and accuracy and F1 improve by more than 1.7% and 1.214%, respectively, on the ChnSentiCorp-Htl-unba-10000 corpus. Compared with char-level pre-training methods, accuracy and Macro-F1 improve by over 3.416% and 4.324%, respectively, on the NLPCC2017-ECGC corpus, and accuracy and F1 improve by more than 0.14% and 3%, respectively, on the ChnSentiCorp-Htl-unba-10000 corpus.
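The abstract does not give the exact form of the adaptive weighted loss, but a common way to realize class-balanced weighting is to scale each sample's cross-entropy term by the inverse frequency of its class. The sketch below (pure Python, with the weighting scheme `w_c = N / (K * n_c)` as an assumption, not the paper's formula) illustrates the idea:

```python
import math
from collections import Counter

def class_weights(labels):
    # Inverse-frequency weights: w_c = N / (K * n_c),
    # so minority classes contribute more to the loss.
    # This particular formula is an assumption for illustration.
    counts = Counter(labels)
    n, k = len(labels), len(counts)
    return {c: n / (k * counts[c]) for c in counts}

def weighted_cross_entropy(probs, labels, weights):
    # Mean over samples of w_y * (-log p_y), where p_y is the
    # predicted probability of the true class y.
    total = sum(weights[y] * -math.log(p[y]) for p, y in zip(probs, labels))
    return total / len(labels)

# Example: an imbalanced binary corpus (3 negative, 1 positive).
labels = [0, 0, 0, 1]
probs = [[0.9, 0.1], [0.9, 0.1], [0.9, 0.1], [0.2, 0.8]]
w = class_weights(labels)          # class 1 is weighted 3x class 0
loss = weighted_cross_entropy(probs, labels, w)
```

With these weights, a misclassified minority-class sample is penalized more heavily than a majority-class one, which is the imbalance-avoidance effect the abstract attributes to its adaptive weighted loss.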
