Abstract

Multi-task learning (MTL) has been successfully applied to numerous NLP tasks, including sequence labeling. In this work, we use three transformer-based models (XLM-R, HerBERT, mBERT) to improve recognition quality through MTL for a selected low-resource language (Polish) on three disjoint sequence labeling tasks with different levels of inter-annotator agreement. Our best MTL model outperforms single-task models both within each task's domain and in overall performance.
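
The abstract does not spell out the architecture, but a common realization of MTL for sequence labeling is a single shared transformer encoder with one token-classification head per task. The sketch below illustrates that setup with Hugging Face Transformers; the encoder name, task names, and label counts are placeholders, and the design is an assumption for illustration, not the authors' exact implementation.

```python
import torch.nn as nn
from transformers import AutoModel


class MultiTaskTagger(nn.Module):
    """Shared transformer encoder with one linear tagging head per task (assumed design)."""

    def __init__(self, encoder_name: str, task_label_counts: dict[str, int]):
        super().__init__()
        # Shared body; e.g. "xlm-roberta-base", "allegro/herbert-base-cased",
        # or "bert-base-multilingual-cased" for the three models in the paper.
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        # One token-classification head per sequence-labeling task.
        self.heads = nn.ModuleDict(
            {task: nn.Linear(hidden, n_labels)
             for task, n_labels in task_label_counts.items()}
        )
        # -100 is the conventional label for subword positions to ignore.
        self.loss_fn = nn.CrossEntropyLoss(ignore_index=-100)

    def forward(self, task: str, input_ids, attention_mask, labels=None):
        # Token-level representations from the shared encoder.
        hidden_states = self.encoder(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state
        logits = self.heads[task](hidden_states)
        if labels is None:
            return logits
        # Flatten the sequence dimension for the per-token loss.
        loss = self.loss_fn(logits.view(-1, logits.size(-1)), labels.view(-1))
        return loss, logits


# Hypothetical task names and label counts, for illustration only.
model = MultiTaskTagger(
    "xlm-roberta-base",
    {"task_a": 9, "task_b": 5, "task_c": 3},
)
```

Training such a model would typically alternate mini-batches across the three tasks, so each optimization step updates the shared encoder and exactly one task head.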
