Abstract

Aspect-based sentiment analysis (ABSA) models typically focus on learning contextual syntactic information and dependency relations. However, these models often lose implicit feature information from the shallow and intermediate layers during training, which can degrade classification performance. Treating the implicit feature information in every layer as equally important, this paper proposes the CABiLSTM-BERT model, which fully exploits the implicit features of each layer to mitigate this information loss and improve accuracy. CABiLSTM-BERT uses a frozen pre-trained BERT model to extract word-vector features from the text, which reduces overfitting and speeds up training. These word vectors are then fed into CABiLSTM, which preserves the implicit feature representations of the input sequence and of the LSTMs in every direction and layer. After multi-head self-attention highlights the important features within each feature group, the model applies convolution to merge all groups into a single set of embedding representations, minimizing information loss and maximizing the use of important implicit feature information from each layer. Finally, the feature representations are average-pooled and passed through the sentiment classification layer to predict polarity. The effectiveness of CABiLSTM-BERT is validated on five publicly available real-world datasets using accuracy and Macro-F1 as evaluation metrics, and the results demonstrate its efficacy on ABSA tasks.
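The pipeline described above (frozen encoder features per layer, per-group self-attention, a convolutional merge across layers, average pooling, then classification) can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the shapes, the random placeholder features standing in for BERT/BiLSTM outputs, the parameter-free single-head attention, and the 1x1 layer-axis convolution (a learned weighted sum, here with random weights) are all simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(h):
    """Parameter-free scaled dot-product self-attention (Q = K = V = h)
    over one (seq_len, dim) feature group; a stand-in for multi-head attention."""
    d = h.shape[-1]
    weights = softmax(h @ h.T / np.sqrt(d), axis=-1)
    return weights @ h

# Hypothetical dimensions for the sketch.
seq_len, dim, num_layers, num_classes = 8, 16, 3, 3

# 1) One implicit feature group per layer/direction, preserved rather than
#    discarded (random placeholders for frozen-BERT + BiLSTM hidden states).
layer_feats = [rng.standard_normal((seq_len, dim)) for _ in range(num_layers)]

# 2) Highlight the important features within each group via self-attention.
attended = np.stack([self_attention(h) for h in layer_feats])  # (L, seq, dim)

# 3) Merge all groups into one embedding representation with a 1x1
#    "convolution" across the layer axis (a weighted sum; random weights here).
merge_w = softmax(rng.standard_normal(num_layers))             # (L,)
merged = np.tensordot(merge_w, attended, axes=1)               # (seq, dim)

# 4) Average-pool over the sequence, then classify sentiment polarity.
pooled = merged.mean(axis=0)                                   # (dim,)
W_cls = rng.standard_normal((dim, num_classes))                # classifier stand-in
probs = softmax(pooled @ W_cls)                                # (num_classes,)
```

The key design point the sketch mirrors is that every layer's feature group survives to the merge step, so no intermediate representation is discarded before classification.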
