Abstract

Neural network models have been shown to achieve state-of-the-art performance in sentence sentiment classification. Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs) are two widely used neural network models in NLP. However, because sentences composed of the same words in a different order can express different sentiment, it cannot be ignored that standard word embedding training discards word order to speed up training. In this work, we argue that word order is important for sentence sentiment classification. We design an encoder-decoder model, CNN-LSTM, that combines the strengths of CNNs and LSTMs, and build it on word embeddings produced by order_w2v, a word2vec variant that takes word order into account during training, to demonstrate that word order plays an important role in sentiment analysis. We evaluate CNN-LSTM and order_w2v on both Chinese and English sentiment classification datasets. The experimental results verify that models that take word order into account achieve better results in sentiment analysis.
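The sketch below is a minimal illustration of the general CNN-LSTM idea the abstract describes, not the authors' exact architecture or hyperparameters: a 1-D convolution extracts local n-gram features from word embeddings, and an LSTM then models those features in order before a sentiment prediction is made. All names and values (vocab_size, embed_dim, max_len, filter and unit counts) are assumptions for illustration; in the paper the embedding layer would be initialized from order_w2v vectors rather than learned from scratch.

```python
# Minimal CNN-LSTM sentiment classifier sketch (assumed hyperparameters).
import tensorflow as tf
from tensorflow.keras import layers

vocab_size, embed_dim, max_len = 20000, 128, 100  # illustrative values

model = tf.keras.Sequential([
    layers.Input(shape=(max_len,)),                               # padded word-index sequences
    layers.Embedding(vocab_size, embed_dim),                      # could be initialized with order-aware (order_w2v) embeddings
    layers.Conv1D(filters=64, kernel_size=3, activation="relu"),  # CNN: local n-gram feature extraction
    layers.MaxPooling1D(pool_size=2),                             # downsample the feature maps
    layers.LSTM(64),                                              # LSTM: models the order of the extracted features
    layers.Dense(1, activation="sigmoid"),                        # binary sentiment output
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```

Because the LSTM consumes the convolutional feature maps sequentially, shuffling the input words changes its hidden-state trajectory, which is the property the paper exploits to show that word order matters for sentiment.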
