Abstract

This paper presents a broad study on the application of the BERT (Bidirectional Encoder Representations from Transformers) model for multiclass text classification, specifically focusing on categorizing business descriptions into one of 13 distinct industry categories. The study involved a detailed fine-tuning phase that produced a consistent decrease in training loss, indicative of the model's learning efficacy. Subsequent validation on a separate dataset revealed the model's robust performance, with classification accuracies ranging from 83.5% to 92.6% across the industry classes. The model achieved a high overall accuracy of 88.23%, coupled with a robust F1 score of 0.88. These results highlight the model's ability to capture and utilize the nuanced features of text data pertinent to various industries. The model can also harness real-time web data, enabling the use of the most up-to-date information affecting a company's product portfolio. Based on the model's performance and its characteristics, we believe that the process of relative valuation can be drastically improved.
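A minimal sketch of the kind of fine-tuning setup the abstract describes (BERT with a 13-way classification head trained on business descriptions), assuming the Hugging Face `transformers`, `datasets`, and `scikit-learn` libraries; the CSV file name and the column names `description` and `industry` are hypothetical placeholders, not the authors' actual data pipeline.

```python
# Sketch: fine-tune BERT for 13-class industry classification of business
# descriptions. File and column names below are illustrative assumptions.
import numpy as np
from datasets import load_dataset
from sklearn.metrics import f1_score
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

NUM_CLASSES = 13  # one label per industry category

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=NUM_CLASSES
)

# Hypothetical CSV with a text column ("description") and an integer
# label column ("industry") in the range 0..12.
dataset = load_dataset("csv", data_files="business_descriptions.csv")["train"]
dataset = dataset.rename_column("industry", "labels")
dataset = dataset.train_test_split(test_size=0.2, seed=42)

def tokenize(batch):
    # Tokenize each business description to fixed-length BERT inputs.
    return tokenizer(
        batch["description"], truncation=True, padding="max_length", max_length=256
    )

dataset = dataset.map(tokenize, batched=True)

def compute_metrics(eval_pred):
    # Report accuracy and weighted F1, the metrics cited in the abstract.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": float((preds == labels).mean()),
        "f1_weighted": float(f1_score(labels, preds, average="weighted")),
    }

args = TrainingArguments(
    output_dir="bert-industry-classifier",
    num_train_epochs=3,
    per_device_train_batch_size=16,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["test"],
    compute_metrics=compute_metrics,
)

trainer.train()
print(trainer.evaluate())  # held-out accuracy and F1
```

This is only an illustration of the standard sequence-classification recipe; hyperparameters such as epoch count, batch size, and maximum sequence length would need to be tuned to reproduce the reported results.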
