Abstract
Disease Named Entity Recognition (DNER) is an emerging field of research that identifies disease mentions in health-related texts. Downstream tasks of DNER include exploring the causes of a given disease, analyzing relationships among diseases, and supporting prevention mechanisms and specialized treatment procedures. Various models have been proposed for DNER, including dictionary-based approaches, rule-based systems, machine learning models with handcrafted features, and deep learning models. Although deep learning models perform better for DNER, they depend heavily on large annotated datasets, which are often not available. Multi-task approaches that combine manually annotated datasets from similar domains boost DNER performance; however, it is often difficult to find a combination of datasets that yields a performance gain. This study addresses these drawbacks by investigating the applicability of transfer learning via Bidirectional Encoder Representations from Transformers (BERT) and proposes a model combining BERT with Conditional Random Fields (CRF), which outperforms comparable state-of-the-art models based on general deep learning and multi-task learning methods.
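The abstract describes the proposed architecture only at a high level: a BERT encoder whose token representations feed a CRF layer for sequence labeling. The sketch below is one plausible way to realize such a tagger; it is not the authors' implementation. It assumes the Hugging Face `transformers` and `pytorch-crf` packages, a BIO label scheme with a single "Disease" entity type, and the `bert-base-cased` checkpoint, all of which are illustrative choices rather than details taken from the paper.

```python
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer
from torchcrf import CRF  # pytorch-crf package


class BertCrfTagger(nn.Module):
    """Hypothetical BERT + CRF token tagger for disease NER (BIO scheme)."""

    def __init__(self, model_name="bert-base-cased", num_labels=3):
        # num_labels=3 assumes the tag set {O, B-Disease, I-Disease}.
        super().__init__()
        self.bert = AutoModel.from_pretrained(model_name)
        self.dropout = nn.Dropout(0.1)
        self.classifier = nn.Linear(self.bert.config.hidden_size, num_labels)
        self.crf = CRF(num_labels, batch_first=True)

    def forward(self, input_ids, attention_mask, labels=None):
        # Contextual token representations from BERT.
        hidden = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        emissions = self.classifier(self.dropout(hidden))
        mask = attention_mask.bool()
        if labels is not None:
            # The CRF returns the log-likelihood of the gold tag sequence;
            # negate it to obtain a training loss.
            return -self.crf(emissions, labels, mask=mask, reduction="mean")
        # Viterbi decoding of the most likely tag sequence per sentence.
        return self.crf.decode(emissions, mask=mask)


# Usage example (weights are untrained here, so predictions are arbitrary):
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = BertCrfTagger()
batch = tokenizer(["The patient was diagnosed with type 2 diabetes."],
                  return_tensors="pt")
with torch.no_grad():
    tags = model(batch["input_ids"], batch["attention_mask"])
print(tags)  # predicted label indices for each token
```

In this reading, transfer learning enters through the pretrained BERT weights, while the CRF layer enforces valid tag transitions (for example, that I-Disease follows B-Disease) over the token-level predictions.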