Abstract

Text similarity calculation based on deep learning has long been an important research topic in the field of natural language processing. However, traditional deep learning models have drawbacks when applied to text similarity calculation, such as insufficient extraction of text semantics, inability to incorporate context, and inability to handle polysemy. To address these problems in existing deep-learning-based text similarity methods, this paper proposes a Siamese network based on ALBERT: the ALBERT model is used for word embedding, and the ABSBGRU model is constructed by combining a Bi-GRU with a Siamese structure and an attention mechanism. While keeping the additional computational cost to a minimum, the model extracts deep semantic information more effectively. Experimental results show that the ABSBGRU model has stronger deep semantic extraction ability; compared with other traditional models, it achieves a higher F1 score, and its training cost is lower than that of some of them.
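
The abstract describes the architecture only at a high level; the sketch below is one plausible reading of it (ALBERT word embeddings feeding a shared Bi-GRU encoder with attention pooling, with cosine similarity between the two sentence vectors). The hidden size, attention form, similarity metric, and class/variable names are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch of a Siamese ALBERT + Bi-GRU + attention similarity model.
# Hyperparameters and the similarity metric are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import AlbertModel, AlbertTokenizer


class ABSBGRUSketch(nn.Module):
    def __init__(self, albert_name="albert-base-v2", gru_hidden=128):
        super().__init__()
        self.albert = AlbertModel.from_pretrained(albert_name)  # shared word-embedding encoder
        self.bigru = nn.GRU(self.albert.config.hidden_size, gru_hidden,
                            batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * gru_hidden, 1)  # simple learned attention scorer

    def encode(self, input_ids, attention_mask):
        # ALBERT token embeddings -> Bi-GRU -> attention-weighted sentence vector
        emb = self.albert(input_ids=input_ids,
                          attention_mask=attention_mask).last_hidden_state
        h, _ = self.bigru(emb)
        scores = self.attn(h).squeeze(-1)
        scores = scores.masked_fill(attention_mask == 0, float("-inf"))
        weights = torch.softmax(scores, dim=-1).unsqueeze(-1)
        return (weights * h).sum(dim=1)

    def forward(self, ids_a, mask_a, ids_b, mask_b):
        # Siamese structure: the same encoder processes both sentences;
        # similarity is taken here as cosine similarity of the two vectors.
        va = self.encode(ids_a, mask_a)
        vb = self.encode(ids_b, mask_b)
        return F.cosine_similarity(va, vb)


if __name__ == "__main__":
    tok = AlbertTokenizer.from_pretrained("albert-base-v2")
    model = ABSBGRUSketch()
    a = tok("How do I reset my password?", return_tensors="pt")
    b = tok("What is the procedure to change my password?", return_tensors="pt")
    sim = model(a["input_ids"], a["attention_mask"],
                b["input_ids"], b["attention_mask"])
    print(sim.item())
```

Because ALBERT shares parameters across layers, using it as the embedding layer keeps the added parameter count small, which is consistent with the paper's goal of minimizing the extra computational cost of the Siamese encoder.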
