Abstract

Background: Coreference resolution is the task of finding strings in text that have the same referent as other strings. Failures of coreference resolution are a common cause of false negatives in information extraction from the scientific literature. In order to better understand the nature of coreference in biomedical publications and to increase performance on the task, we annotated the Colorado Richly Annotated Full Text (CRAFT) corpus with coreference relations.

Results: The corpus was manually annotated with coreference relations, including identity and appositives, for all coreferring base noun phrases. The OntoNotes annotation guidelines, with minor adaptations, were used. Inter-annotator agreement ranges from 0.480 (entity-based CEAF) to 0.858 (class-B3), depending on the metric used to assess it. The resulting corpus adds nearly 30,000 annotations to the previous release of the CRAFT corpus. Differences from related projects include a much broader definition of markables, connection to extensive annotation of several domain-relevant semantic classes, and connection to complete syntactic annotation. Tool performance was benchmarked on the data. A publicly available, out-of-the-box, general-domain coreference resolution system achieved an F-measure of 0.14 (B3), while a simple domain-adapted rule-based system achieved an F-measure of 0.42; an ensemble of the two reached an F-measure of 0.46. Following the IDENTITY chains in the data would add 106,263 additional named entities in the full 97-paper corpus, an increase of 76% in the semantic classes of the eight ontologies annotated in earlier versions of the CRAFT corpus.

Conclusions: The project produced a large data set for further investigation of coreference and coreference resolution in the scientific literature. The work raised issues concerning the phenomenon of reference in this domain and genre, and the paper proposes that many mentions that would be considered generic in the general domain are not generic in the biomedical domain, because they refer to specific classes in domain-specific ontologies. The comparison of a publicly available, well-understood coreference resolution system with a domain-adapted system produced results consistent with the notion that the requirements for successful coreference resolution in this genre are quite different from those of the general domain, and also suggests that the difference in baseline performance is quite large.
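The agreement and system scores above are reported with standard coreference metrics such as B3 and entity-based CEAF. As a rough illustration of how the B3 (B-cubed) metric works, the sketch below computes mention-level B3 precision, recall, and F over a gold ("key") and a system ("response") clustering of the same mentions; this is a minimal sketch following the standard Bagga-and-Baldwin formulation, not the evaluation code used in the paper, and the toy mention chains are hypothetical.

```python
# Minimal B3 (B-cubed) sketch: not the paper's evaluation code.
def b_cubed(key_chains, response_chains):
    """Each argument is a list of sets of mention IDs (one set per chain)."""
    def chain_of(mention, chains):
        # Treat a mention missing from a clustering as a singleton chain.
        return next((c for c in chains if mention in c), {mention})

    mentions = set().union(*key_chains)
    precision = recall = 0.0
    for m in mentions:
        k = chain_of(m, key_chains)        # gold chain containing m
        r = chain_of(m, response_chains)   # system chain containing m
        overlap = len(k & r)
        precision += overlap / len(r)
        recall += overlap / len(k)
    precision /= len(mentions)
    recall /= len(mentions)
    f = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f

# Toy example: the system splits one gold chain in two.
key = [{"m1", "m2", "m3"}, {"m4"}]
response = [{"m1", "m2"}, {"m3"}, {"m4"}]
print(b_cubed(key, response))  # (1.0, 0.666..., 0.8)
```

Entity-based CEAF instead scores an optimal one-to-one alignment between gold and system chains, which is why the two metrics can yield quite different numbers for the same annotations.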

Highlights

  • Coreference resolution is the task of finding strings in text that have the same referent as other strings

  • It is of interest to many fields; here we focus on the significance of coreference and coreference resolution for natural language processing

  • Characteristics of the first version of the Colorado Richly Annotated Full Text (CRAFT) Corpus that are relevant to the work reported here are that it is a collection of 97 full-length, open-access biomedical journal articles that have been annotated with the semantic classes of eight ontologies and with complete syntactic annotation


Summary

Introduction

Coreference resolution is the task of finding strings in text that have the same referent as other strings. Context and motivation: coreference, broadly construed, is the phenomenon of multiple expressions within a natural language text referring to the same entity or event. As quoted by [4], Halliday and Hasan [5] define the phenomenon of anaphora as “cohesion which points back to some previous item.” Such cohesion is typically referred to as anaphoric when it involves either pronouns (defined by [6] as “the closed set of items which can be used to substitute for a noun phrase”) or noun phrases or events that are semantically unspecified, i.e., that do not refer clearly to a specific individual in some model of the world. The boundaries are fuzzy and not widely agreed upon, and, as mentioned above, we take a very inclusive view of coreferential phenomena here.
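To make the relations at issue concrete, the sketch below shows one way an IDENTITY chain and an appositive (APPOS) relation over base noun phrases could be represented; this is an assumed, illustrative data model, and the example sentence, mention offsets, and relation names are not drawn from the CRAFT annotations themselves.

```python
# Illustrative sketch of mentions, an IDENTITY chain, and an APPOS relation.
from dataclasses import dataclass

@dataclass(frozen=True)
class Mention:
    start: int   # character offset where the mention begins
    end: int     # character offset where the mention ends (exclusive)
    text: str

text = ("Cx26, the protein encoded by Gjb2, is expressed in the cochlea. "
        "It forms gap junctions.")

cx26  = Mention(0, 4, "Cx26")
appos = Mention(6, 33, "the protein encoded by Gjb2")
it    = Mention(64, 66, "It")

# Sanity-check the offsets against the text.
for m in (cx26, appos, it):
    assert text[m.start:m.end] == m.text

# An IDENTITY chain groups all mentions with the same referent, including
# the anaphoric pronoun "It"; APPOS links a mention to its appositive.
identity_chain = [cx26, appos, it]
appositive = (cx26, appos)
```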

