Abstract

Recent advances in deep neural models allow us to build reliable named entity recognition (NER) systems without handcrafting features. However, such methods require large amounts of manually labeled training data. There have been efforts to replace human annotations with distant supervision (in conjunction with external dictionaries), but the resulting noisy labels pose significant challenges to learning effective neural models. Here we propose two neural models tailored to this noisy, dictionary-based distant supervision. First, under the traditional sequence labeling framework, we propose a revised fuzzy CRF layer to handle tokens with multiple possible labels. After identifying the nature of noisy labels in distant supervision, we go beyond the traditional framework and propose a novel, more effective neural model, AutoNER, with a new Tie or Break tagging scheme. In addition, we discuss how to refine distant supervision for better NER performance. Extensive experiments on three benchmark datasets demonstrate that AutoNER achieves the best performance when using only dictionaries, with no additional human effort, and delivers results competitive with state-of-the-art supervised benchmarks.
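
To make the Tie or Break scheme concrete, below is a minimal sketch of how such labels could be derived from dictionary matches: each pair of adjacent tokens gets a connection label rather than each token getting an entity label. This is an illustrative reading of the scheme, not the released AutoNER code; the function name, span format, and example tokens are assumptions.

```python
# Sketch (assumed helper, not the authors' implementation) of deriving
# "Tie or Break" labels from dictionary matches. For each pair of
# adjacent tokens, the connection is:
#   Tie     - both tokens lie inside the same matched entity mention,
#   Unknown - either token lies inside an unknown-typed high-quality
#             phrase, so this connection is skipped during training,
#   Break   - otherwise.

def tie_or_break(tokens, entity_spans, unknown_spans):
    """Spans are (start, end) token offsets, end exclusive."""
    labels = []
    for i in range(1, len(tokens)):
        if any(s <= i - 1 and i < e for s, e in entity_spans):
            labels.append("Tie")
        elif any(s <= i - 1 < e or s <= i < e for s, e in unknown_spans):
            labels.append("Unknown")
        else:
            labels.append("Break")
    return labels

# e.g., with "ceftriaxone sodium" matched as one chemical mention:
tokens = ["ceftriaxone", "sodium", "was", "given"]
print(tie_or_break(tokens, [(0, 2)], []))   # ['Tie', 'Break', 'Break']
```

Entity types are then predicted for the chunks delimited by Break connections, while Unknown connections are excluded from the training loss.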

Highlights

  • Extensive efforts have been made to build reliable named entity recognition (NER) models without handcrafting features (Liu et al., 2018; Ma and Hovy, 2016; Lample et al., 2016)

  • We propose AutoNER, a novel neural model with the new Tie or Break scheme for the distantly supervised NER task

  • We design and explore two neural models, Fuzzy-LSTM-CRF (conditional random field) with the modified IOBES scheme and AutoNER with the Tie or Break scheme, to learn named entity taggers from such labels with unknown and multiple types (a sketch of the fuzzy-CRF objective follows this list)
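
For the first model, the key ingredient is a fuzzy CRF objective: rather than scoring a single gold label path, the forward algorithm sums over every label sequence that is consistent with each token's set of possible labels. Below is a minimal sketch of that objective; the NumPy/SciPy formulation, function names, and shapes are assumptions for exposition, not the paper's implementation.

```python
import numpy as np
from scipy.special import logsumexp

def fuzzy_crf_nll(emissions, transitions, allowed):
    """emissions: (T, L) token-label scores; transitions: (L, L) scores
    for moving from label i to label j; allowed: one list of permitted
    label indices per token. Returns -log p(any allowed path)."""
    T, L = emissions.shape

    def forward(mask):
        # Standard CRF forward pass in log space; labels outside the
        # mask are set to -inf so their paths contribute nothing.
        alpha = np.where(mask[0], emissions[0], -np.inf)
        for t in range(1, T):
            alpha = logsumexp(alpha[:, None] + transitions, axis=0) + emissions[t]
            alpha = np.where(mask[t], alpha, -np.inf)
        return logsumexp(alpha)

    full = np.ones((T, L), dtype=bool)          # all paths: log partition Z
    restricted = np.zeros((T, L), dtype=bool)   # only dictionary-consistent paths
    for t, labs in enumerate(allowed):
        restricted[t, labs] = True
    return forward(full) - forward(restricted)

# Illustrative usage: tokens 2-3 are ambiguous between two entity types.
emissions = np.random.randn(4, 5)
transitions = np.random.randn(5, 5)
loss = fuzzy_crf_nll(emissions, transitions, [[0], [1, 2], [1, 2], [0]])
```

Training minimizes this quantity, which pushes probability mass onto the whole set of label sequences compatible with the (possibly ambiguous) dictionary matches.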

Summary

Introduction

Extensive efforts have been made to build reliable named entity recognition (NER) models without handcrafting features (Liu et al., 2018; Ma and Hovy, 2016; Lample et al., 2016). Most existing methods require large amounts of manually annotated sentences for training supervised models (e.g., neural sequence models) (Liu et al., 2018; Ma and Hovy, 2016; Lample et al., 2016; Finkel et al., 2005). The code is released at https://github.com/shangjingbo1226/AutoNER, supporting the development of various domain-specific systems in a plug-in-and-play manner.

Overview
Fuzzy-LSTM-CRF with Modified IOBES
AutoNER with “Tie or Break”
Corpus-Aware Dictionary Tailoring
Unknown-Typed High-Quality Phrases
Experiments
Experimental Settings
Compared Methods
Method
NER Performance Comparison
Distant Supervision Explorations
Comparison with Gold Supervision
Related Work
Findings
Conclusion and Future Work