Abstract

Intent recognition and slot filling are two key steps in natural language understanding. Historically the two tasks were handled separately, but a large body of recent joint-modeling work has shown that they are closely related and that sharing knowledge between them yields better performance. Previous studies have focused on implicit multi-task joint modeling, or have let slot filling depend on information from intent recognition, ignoring that the two tasks are mutually related. In this paper, the joint model Bi-Correlation is improved to form a cross-correlation between intent classification and slot filling. It comprises two modules, IR2SF and SF2IR, so that the performance of intent recognition and slot filling can be mutually enhanced. Moreover, to address the poor generalization that arises on small datasets, the BERT pre-trained model is used to improve the generalization, and thereby the performance, of the model. Experiments on two Chinese datasets, CAIS and SMP-ECDT, show that the proposed model outperforms existing models and significantly improves accuracy.
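
The abstract does not specify the internals of the IR2SF and SF2IR modules. The following is a minimal PyTorch sketch of one plausible reading: a BERT encoder whose sentence-level and token-level features are exchanged through two cross-attention blocks, one in each direction. All layer names, dimensions, and the use of multi-head attention are assumptions for illustration, not the paper's published implementation.

```python
import torch.nn as nn
from transformers import BertModel

class BiCorrelationSketch(nn.Module):
    """Hypothetical sketch of a BERT-based joint model with two
    cross-correlation modules (SF2IR and IR2SF). Layer choices and
    sizes are assumptions; the paper's exact design is not shown here."""

    def __init__(self, num_intents, num_slots, hidden=768):
        super().__init__()
        self.encoder = BertModel.from_pretrained("bert-base-chinese")
        # SF2IR: token-level (slot) features inform the intent decision.
        self.sf2ir = nn.MultiheadAttention(hidden, num_heads=8, batch_first=True)
        # IR2SF: the intent representation informs per-token slot labels.
        self.ir2sf = nn.MultiheadAttention(hidden, num_heads=8, batch_first=True)
        self.intent_head = nn.Linear(hidden, num_intents)
        self.slot_head = nn.Linear(hidden, num_slots)

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        tokens = out.last_hidden_state        # (B, T, H) token features
        cls = tokens[:, :1, :]                # (B, 1, H) sentence feature
        # SF2IR: sentence vector attends over token (slot-level) features.
        intent_ctx, _ = self.sf2ir(cls, tokens, tokens)
        # IR2SF: token features attend to the intent-level representation.
        slot_ctx, _ = self.ir2sf(tokens, intent_ctx, intent_ctx)
        intent_logits = self.intent_head(intent_ctx.squeeze(1))  # (B, num_intents)
        slot_logits = self.slot_head(tokens + slot_ctx)          # (B, T, num_slots)
        return intent_logits, slot_logits
```

Under this reading, the two attention blocks make the interaction explicitly bidirectional, in contrast to prior pipelines where only slot filling consumes intent information.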
