Abstract
In Chinese sentiment analysis, many existing methods rely on recurrent neural networks (e.g., long short-term memory networks and gated recurrent units) and standard one-dimensional convolutional neural networks (1D-CNNs) to extract features, because recurrent networks can model order dependence in the data to some extent and one-dimensional convolution can extract local features. Although these methods perform well on sentiment analysis tasks, recurrent neural networks (RNNs) cannot be parallelized, which makes them time-inefficient, and a standard 1D-CNN can only extract features from a single sample, so feature information is not fully exploited. To this end, we propose a multichannel two-dimensional convolutional neural network based on interactive features and group strategy (MCNN-IFGS) for Chinese sentiment analysis. First, instead of word-level encoding, we use character-based integer encoding to retain more fine-grained information. Second, interactive features between different elements of the character-level vectors are introduced to increase the dimensionality of the feature vectors and supplement semantic information, so that the input matches the network. Third, to ensure that more sentiment features are learned, a group strategy assembles several feature mapping groups, shifting the learning object from a single sample to a feature mapping group. Finally, multichannel two-dimensional convolutional neural networks with convolution kernels of different sizes extract sentiment features at different scales. Experimental results on Chinese datasets show that the proposed method outperforms baseline and state-of-the-art methods.
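The multichannel two-dimensional convolution over character-level encodings described above can be sketched in a few lines of PyTorch. The block below is a minimal illustration only: the vocabulary size, sequence length, embedding dimension, kernel heights, filter count, and class count are assumed values, and the interactive-feature and group-strategy components of MCNN-IFGS are omitted because the abstract does not specify them in enough detail.

```python
# Minimal PyTorch sketch of a multichannel 2D-CNN over character-level integer
# encodings. All hyperparameters are illustrative assumptions, not the authors'
# exact configuration.
import torch
import torch.nn as nn

class MultiChannel2DCNN(nn.Module):
    def __init__(self, vocab_size=5000, embed_dim=128,
                 kernel_heights=(3, 4, 5), num_filters=64, num_classes=2):
        super().__init__()
        # Character IDs are mapped to dense vectors; the resulting
        # (seq_len x embed_dim) matrix is treated as a single-channel "image".
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # One branch (channel) per kernel height captures sentiment features
        # at a different scale.
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(1, num_filters, kernel_size=(h, embed_dim)),
                nn.ReLU(),
                nn.AdaptiveMaxPool2d((1, 1)),
            )
            for h in kernel_heights
        ])
        self.classifier = nn.Linear(num_filters * len(kernel_heights), num_classes)

    def forward(self, char_ids):                   # char_ids: (batch, seq_len)
        x = self.embedding(char_ids).unsqueeze(1)  # (batch, 1, seq_len, embed_dim)
        feats = [branch(x).flatten(1) for branch in self.branches]
        return self.classifier(torch.cat(feats, dim=1))

# Example: a batch of 8 sentences, each encoded as 100 character IDs.
model = MultiChannel2DCNN()
logits = model(torch.randint(0, 5000, (8, 100)))
print(logits.shape)  # torch.Size([8, 2])
```

Because each kernel spans the full embedding dimension, the different kernel heights act like character n-gram detectors of different lengths, which is one common way to realize "convolution kernels of different sizes" for text.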
Highlights
Social media and online shopping platforms are widely used, and many users are willing to share their opinions and comments on these platforms
Motivated by the challenges of the above methods, and inspired by the success of two-dimensional convolution in the image field and of multichannel networks, we propose a multichannel two-dimensional convolutional neural network based on interactive features and group strategy (MCNN-IFGS) for Chinese sentiment analysis tasks
We propose a multichannel two-dimensional convolutional neural network (MCNN-IFGS) for sentiment analysis
Summary
Social media and online shopping platforms are widely used, and many users are willing to share their opinions and comments on these platforms. Sentiment analysis based on traditional machine learning typically trains a sentiment classifier on a given dataset and then uses that classifier to predict sentiment polarity. Although some progress has been made with traditional machine learning, the limitations of the approach mean that it cannot represent sentiment features well, and its use of emotional information during training is limited. Although neural-network-based methods have made progress on sentiment analysis tasks compared with sentiment-dictionary-based and traditional machine learning methods, a single neural network model still cannot fully extract sentiment features. Motivated by these challenges, and inspired by the success of two-dimensional convolution in the image field and of multichannel networks, we propose a multichannel two-dimensional convolutional neural network based on interactive features and group strategy (MCNN-IFGS) for Chinese sentiment analysis tasks.
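The traditional machine learning pipeline mentioned above (fit a sentiment classifier on labelled data, then predict polarity) can be illustrated with a short scikit-learn sketch. The feature representation (character TF-IDF) and the classifier (logistic regression), as well as the toy review texts, are illustrative assumptions and are not taken from the paper or any specific baseline.

```python
# Minimal scikit-learn sketch of a traditional machine learning sentiment
# classifier: train on labelled reviews, then predict sentiment polarity.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labelled data: 1 = positive, 0 = negative (hypothetical examples).
train_texts = ["这个产品非常好", "质量很差，不推荐", "物流快，很满意", "太失望了"]
train_labels = [1, 0, 1, 0]

# Character n-grams sidestep Chinese word segmentation for this toy example.
clf = make_pipeline(
    TfidfVectorizer(analyzer="char", ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
clf.fit(train_texts, train_labels)
print(clf.predict(["服务态度很好"]))  # expected to predict the positive class
```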