Abstract
Text classification includes both sentence-level sentiment classification and target-based sentiment classification. Target-based sentiment analysis aims to determine the sentiment polarity of a given sentence with respect to different opinion targets. Recurrent neural networks (RNNs) are well suited to this task and currently achieve state-of-the-art (SOTA) performance. Most previous work models target and context words with an RNN augmented by an attention mechanism. However, RNNs are hard to parallelize during training and consume substantial memory. Moreover, long-term memory can cause confusion on this task: given "the food is delicious but the service is frustrating," a model with long-range memory may conflate the positive sentiment toward the food with the negative sentiment toward the service. Convolutional neural networks (CNNs) are valuable in this situation because they can capture local n-gram information that RNNs cannot. To address these issues, this paper proposes an Attention Transformer Network (ATNet). Our model employs an attention mechanism and transformer components to generate target-oriented representations, together with CNN layers to extract n-gram features. On open benchmark datasets, our proposed models achieve state-of-the-art results of 70.3%, 72.1%, and 83.4% on three benchmarks. We also apply pretrained BERT in the encoder and obtain SOTA performance. Extensive comparative experiments demonstrate the effectiveness of our method.
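The two core ingredients named above, target-oriented attention and convolutional n-gram extraction, can be sketched minimally as follows. This is an illustrative outline under our own assumptions, not the paper's implementation: function names, shapes, and the use of numpy in place of a deep-learning framework are ours.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def target_attention(context, target):
    """Scaled dot-product attention of a target vector over context words.

    context: (n, d) word representations; target: (d,) target representation.
    Returns a (d,) target-oriented summary of the context.
    """
    scores = context @ target / np.sqrt(context.shape[1])
    weights = softmax(scores)          # (n,) attention distribution
    return weights @ context           # weighted sum of context words

def conv1d_ngrams(context, kernel, width=3):
    """A single 1-D convolution filter extracting local n-gram features.

    kernel: (width * d,) flattened filter. Returns (n - width + 1,) feature map.
    """
    n, d = context.shape
    feats = []
    for i in range(n - width + 1):
        window = context[i:i + width].reshape(-1)  # concatenate the n-gram
        feats.append(np.tanh(window @ kernel))
    return np.array(feats)

# Toy usage: 6 context words, 4-dimensional embeddings.
rng = np.random.default_rng(0)
ctx = rng.normal(size=(6, 4))
tgt = rng.normal(size=4)
summary = target_attention(ctx, tgt)       # (4,) target-oriented vector
ngram_feats = conv1d_ngrams(ctx, rng.normal(size=12))  # (4,) local features
```

In a full model, the attention summary and the (max-pooled) convolutional features would be concatenated and fed to a classifier; here each piece is shown in isolation.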