Abstract
Purpose
With the development of Web information systems, steel e-commerce platforms have accumulated large volumes of quality objection texts. These texts reflect consumer dissatisfaction with the dimensions, appearance and performance of steel products, providing valuable insights for product improvement and consumer decision-making. Mainstream solutions currently rely on pre-trained models, but their performance on domain-specific and few-shot data sets is unsatisfactory. This paper aims to address these challenges by proposing more effective methods for improving model performance on such specialized data sets.

Design/methodology/approach
This paper presents a method based on in-domain pre-training, Bidirectional Encoder Representations from Transformers (BERT) and prompt learning. Specifically, a domain-specific unsupervised data set is used to further pre-train the BERT model, enabling it to learn the language patterns specific to the steel e-commerce industry and enhancing its generalization capability. Prompt learning is then incorporated into the BERT model to strengthen attention to sentence context, improving classification performance on few-shot data sets.

Findings
Experimental evaluation shows that the method achieves superior performance on the quality objection data set, with a Macro-F1 score of 93.32%. Ablation experiments further confirm that both in-domain pre-training and prompt learning contribute significantly to model performance.

Originality/value
This study demonstrates the value of the proposed method for classifying quality objection texts about steel products. The findings offer practical insights for product improvement in the steel industry and suggest new directions for research on few-shot learning and domain-specific models, with potential applications in other fields.
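The abstract does not include the authors' implementation; the sketch below is a minimal illustration, using Hugging Face Transformers, of the two stages it describes: continued masked-language-model (MLM) pre-training of BERT on unlabeled domain text, followed by prompt-based classification in which BERT fills a [MASK] slot with a label word. The checkpoint name, toy corpus, prompt template and verbalizer (label-word mapping) are assumptions for illustration only, not the paper's actual settings.

```python
import torch
from datasets import Dataset
from transformers import (
    BertForMaskedLM,
    BertTokenizerFast,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# Assumed checkpoint; the paper's base model may differ.
tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")

# --- Stage 1: in-domain pre-training (MLM) on unlabeled objection texts ---
corpus = Dataset.from_dict({
    "text": [
        "the coil surface shows scratches after unpacking",       # toy examples
        "plate thickness deviates from the order specification",
    ],
})
tokenized = corpus.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True,
    remove_columns=["text"],
)
collator = DataCollatorForLanguageModeling(tokenizer, mlm_probability=0.15)
Trainer(
    model=model,
    args=TrainingArguments(output_dir="indomain-bert", num_train_epochs=1,
                           per_device_train_batch_size=8),
    train_dataset=tokenized,
    data_collator=collator,
).train()

# --- Stage 2: prompt learning: classify by predicting a label word ---
# Hypothetical template and verbalizer mapping classes to single vocab words.
label_words = {"dimension": "dimension",
               "appearance": "appearance",
               "performance": "performance"}
text = "the plate thickness deviates from the order specification"
prompt = f"{text} . the complaint is about {tokenizer.mask_token} ."

inputs = tokenizer(prompt, return_tensors="pt")
model.eval()
with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and score each label word at that slot.
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]
scores = {label: logits[0, mask_pos, tokenizer.convert_tokens_to_ids(word)].item()
          for label, word in label_words.items()}
print("predicted class:", max(scores, key=scores.get))
```

Prompt learning reframes classification as the same cloze task BERT was pre-trained on, which is why it tends to help in few-shot settings: the model reuses its MLM head instead of learning a new classification head from scratch.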