Abstract

Many tasks in natural language processing can be effectively treated as sequence labeling (SL) problems. Most existing studies either solve these tasks as independent sequence labeling problems or use multiple auxiliary tasks to boost the performance of a specific target task, thereby ignoring the potential relationships and mutual influences among the tasks. We propose a self-attention based joint sequence labeling model (SA-JSL) in a neural network framework that fully exploits the possible interactions among multiple sequence labeling tasks, so that the performance of every task improves simultaneously. Specifically, by fusing a self-attention mechanism with a joint tag mechanism, the model converts multiple sequence labeling tasks into a single unified sequence labeling task and exploits the latent relationships among the tasks, allowing them to promote and improve one another. The self-attention mechanism further supplies the joint model with richer contextual information, improving overall performance. Experiments on seven popular benchmark datasets verify the effectiveness of the model.
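The abstract gives no implementation details, but the joint tag mechanism it describes can be read as fusing the per-token tags of several tasks into composite labels that a single self-attention tagger predicts. The sketch below illustrates this reading; the example tasks (POS tagging and chunking), the "|" separator, and the tiny model dimensions are assumptions for illustration, not the authors' code.

```python
# A minimal sketch of a joint-tag mechanism with a self-attention encoder.
# Assumes two illustrative tasks (POS tagging, chunking); all names and
# sizes here are hypothetical, not taken from the paper.
import torch
import torch.nn as nn

def fuse_tags(pos_tags, chunk_tags):
    """Fuse per-token tags from two tasks into one joint tag sequence."""
    return [f"{p}|{c}" for p, c in zip(pos_tags, chunk_tags)]

def split_tags(joint_tags):
    """Recover the per-task tag sequences from the joint tags."""
    pos, chunk = zip(*(t.split("|") for t in joint_tags))
    return list(pos), list(chunk)

class JointTagger(nn.Module):
    """Self-attention encoder followed by a classifier over joint tags."""
    def __init__(self, vocab_size, num_joint_tags, d_model=64, nhead=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.attn = nn.MultiheadAttention(d_model, nhead, batch_first=True)
        self.out = nn.Linear(d_model, num_joint_tags)

    def forward(self, token_ids):
        x = self.embed(token_ids)      # (batch, seq, d_model)
        ctx, _ = self.attn(x, x, x)    # self-attention over the sentence
        return self.out(ctx)           # per-token joint-tag logits

# Example: two tasks collapsed into a single labeling problem.
joint = fuse_tags(["DT", "NN"], ["B-NP", "I-NP"])  # ['DT|B-NP', 'NN|I-NP']
logits = JointTagger(vocab_size=100, num_joint_tags=16)(torch.tensor([[1, 2]]))
print(joint, logits.shape)  # torch.Size([1, 2, 16])
```

Under this reading, predicting joint tags lets a single model learn both tasks' label interactions directly, at the cost of a larger output space (the cross product of the per-task tag sets).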
