Abstract

In recent years, Large Language Models (LLMs) have increasingly been applied to a wide range of problems in computational linguistics; their larger parameter counts and deeper architectures make them better suited to approximating the objective function in multi-domain tasks. With the rapid growth of e-commerce, interest in extracting insight from other users' reviews has grown as well, making fine-grained sentiment analysis increasingly valuable. Aspect-Based Sentiment Analysis (ABSA) is a sub-problem of Sentiment Analysis that comprises two sub-tasks: target identification and opinion mining. Existing methods handle ABSA well when there is a clear relationship between the aspect and the labels, which is the most common situation in application scenarios. This paper proposes an ABSA method that combines zero-shot classification with a fine-tuned Bidirectional Encoder Representations from Transformers (BERT) model. The method is evaluated experimentally on five rating tags (1-5) for the "service" and "environment" aspects, and the predictions of the resulting model are compared with those of the original model. The results show a substantial improvement in prediction accuracy across the different aspects.
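
The following is a minimal sketch of the general idea of zero-shot, aspect-level rating classification, assuming the Hugging Face Transformers zero-shot pipeline. The model name, hypothesis template, and example review are illustrative assumptions, not the paper's actual architecture or fine-tuning procedure.

```python
# Minimal sketch (not the paper's exact pipeline): zero-shot aspect-level
# rating prediction with an NLI-based zero-shot classifier. The paper
# additionally fine-tunes a BERT model, which is not shown here.
from transformers import pipeline

# Hypothetical choice of zero-shot model; any NLI-style checkpoint works.
classifier = pipeline("zero-shot-classification",
                      model="facebook/bart-large-mnli")

review = "The waiters were friendly, but the dining room felt cramped."
aspects = ["service", "environment"]   # the two aspect levels in the paper
ratings = ["1", "2", "3", "4", "5"]    # the five sentiment tags

for aspect in aspects:
    result = classifier(
        review,
        candidate_labels=ratings,
        hypothesis_template=f"The {aspect} of this place deserves a rating of {{}} out of 5.",
    )
    # Candidate labels are returned sorted by score; take the top-ranked tag.
    print(aspect, "->", result["labels"][0])
```

A fine-tuned BERT classifier for the same task would instead be trained with five output labels per aspect; the sketch above only illustrates the zero-shot component.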
