Abstract
Semantic similarity has long been a difficult problem in NLP. Learning the deep meaning of text and comparing the similarity of two texts link text representation to downstream applications. In this study, a Siamese ELECTRA Network combined with BERT, named SENB, is proposed to address the semantic textual similarity problem. The model uses a Siamese network structure to extract difference information from sentence pairs, with ELECTRA serving as the encoding layer of the Siamese network to improve computational efficiency and accuracy. The difference information obtained from the Siamese network is combined with the interaction features obtained from BERT to determine semantic textual similarity. Experimental results on the TwitterPPDB dataset show that the proposed SENB method achieves an accuracy of 82.8% and an F1-score of 84.1%, outperforming other methods and demonstrating its effectiveness.
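The fusion described above — difference features from a weight-sharing Siamese branch concatenated with interaction features from a cross-encoder — can be sketched as follows. This is an illustrative sketch only, not the authors' implementation: the toy `encode` function and random projections stand in for the pretrained ELECTRA and BERT encoders, the `[u; v; |u - v|]` difference-feature layout is a common Siamese-network convention assumed here, and the linear classification head is untrained.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 16    # toy encoder output dimension
VOCAB = 64  # toy hashed bag-of-words vocabulary size

def encode(sentence: str, proj: np.ndarray) -> np.ndarray:
    """Toy stand-in encoder: hash tokens into a bag-of-words vector,
    then apply a linear projection. In SENB this role is played by
    pretrained ELECTRA (Siamese branch) or BERT (interaction branch)."""
    bow = np.zeros(proj.shape[0])
    for tok in sentence.lower().split():
        bow[hash(tok) % len(bow)] += 1.0
    return proj.T @ bow

# Shared weights for the Siamese branch ("ELECTRA"), separate weights
# for the cross-encoding interaction branch ("BERT") — placeholders.
siamese_proj = rng.normal(size=(VOCAB, DIM))
interact_proj = rng.normal(size=(VOCAB, DIM))

def senb_features(a: str, b: str) -> np.ndarray:
    """Fuse Siamese difference features with cross-encoder interaction
    features for a sentence pair (a, b)."""
    u = encode(a, siamese_proj)                       # sentence a, shared encoder
    v = encode(b, siamese_proj)                       # sentence b, same weights
    diff = np.concatenate([u, v, np.abs(u - v)])      # difference information
    inter = encode(a + " [SEP] " + b, interact_proj)  # pair read jointly
    return np.concatenate([diff, inter])              # fused representation

def predict_similar(a: str, b: str, w: np.ndarray) -> bool:
    """Binary paraphrase decision from a linear head (untrained here)."""
    return float(w @ senb_features(a, b)) > 0.0

w = rng.normal(size=4 * DIM)  # 3*DIM difference + DIM interaction features
feats = senb_features("the cat sat", "a cat was sitting")
print(feats.shape)  # (64,)
```

In a real system the fused vector would feed a trained softmax classifier; the key design point, as in the abstract, is that the two branches contribute complementary signals — the Siamese branch encodes each sentence independently, while the cross-encoder models token-level interactions between the pair.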