Abstract

User-intent classification is a sub-task in natural language understanding for human-computer dialogue systems. To reduce the data-volume requirement of deep learning for intent classification, this paper proposes a transfer learning method for the Chinese user-intent classification task based on the Bidirectional Encoder Representations from Transformers (BERT) pre-trained language model. First, a simulation experiment with 31 Chinese participants was conducted to collect first-hand Chinese human-computer conversation data. The data was then augmented through back-translation and randomly split into training, validation and test datasets. Next, the BERT model was fine-tuned into a Chinese user-intent classifier. The resulting classifier reaches a prediction accuracy of 99.95%, 98.39% and 99.89% on the training, validation and test datasets, respectively. This result suggests that BERT-based transfer learning reduces the data-volume requirement of the Chinese intent classification task to a satisfactory level.

Keywords: User-intent classification, Human-computer dialogue system, Pre-trained language model, Transfer learning
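The data-preparation pipeline described in the abstract (back-translation augmentation followed by a random split into training, validation and test sets) might be sketched as follows. This is a minimal illustration, not the authors' code: the 8:1:1 split ratio, the random seed, and the `back_translate` stand-in (an identity placeholder in place of a real machine-translation round trip) are all assumptions not given in the abstract.

```python
import random

def back_translate(sentence):
    # In the paper, each utterance is translated to a pivot language and
    # back to Chinese to create a paraphrase. A hypothetical identity
    # stand-in is used here so the sketch runs without an MT model.
    return sentence

def augment_and_split(samples, seed=42, train_frac=0.8, val_frac=0.1):
    # Augment: keep each original (utterance, intent) pair plus its
    # back-translated paraphrase with the same intent label.
    augmented = []
    for text, label in samples:
        augmented.append((text, label))
        augmented.append((back_translate(text), label))
    # Shuffle and split randomly into train / validation / test sets
    # (assumed 8:1:1 ratio; the abstract does not state the proportions).
    rng = random.Random(seed)
    rng.shuffle(augmented)
    n = len(augmented)
    n_train = int(n * train_frac)
    n_val = int(n * val_frac)
    return (augmented[:n_train],
            augmented[n_train:n_train + n_val],
            augmented[n_train + n_val:])

# Toy corpus of labeled Chinese utterances (illustrative labels only).
corpus = [("帮我查一下明天的天气", "weather"),
          ("播放一首周杰伦的歌", "music")] * 50
train, val, test = augment_and_split(corpus)
print(len(train), len(val), len(test))  # → 160 20 20
```

The resulting splits would then feed a standard sequence-classification fine-tuning loop over a Chinese BERT checkpoint, with the intent labels as the classification targets.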
