Abstract

Crowdsourcing and other human computation techniques have proven useful in collecting large numbers of annotations for various datasets. In the majority of cases, online platforms are used to run crowdsourcing campaigns. Local crowdsourcing is a variant in which annotation is done at specific physical locations. This paper describes a local crowdsourcing concept, platform and experiment. The case setting concerns eliciting annotations for an audio archive. For the experiment, we developed a hardware platform designed to be deployed in building elevators. To evaluate the effectiveness of the platform and to test the influence of location on the annotation results, an experiment was set up in two different locations. In each location, two different user interaction modalities were used. The results show that our simple local crowdsourcing setup is able to achieve acceptable accuracy levels with up to 4 annotations per hour, and that the location has a significant effect on accuracy.
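To make the setup concrete: the summary gives no implementation details, but on a small single-board device the annotation loop could look roughly like the sketch below. The fragment paths, questions, location label and the use of `aplay` are assumptions made for illustration only; a real deployment would read physical push buttons (or a speech interface) rather than stdin.

```python
import csv
import random
import subprocess
from datetime import datetime, timezone

# Hypothetical pool of audio fragments awaiting annotation, each paired with a
# yes/no question about its content.
TASKS = [
    {"fragment": "clips/fragment_001.wav", "question": "Do you hear more than one speaker?"},
    {"fragment": "clips/fragment_002.wav", "question": "Is this fragment a radio broadcast?"},
]

LOCATION = "building-A-elevator"   # fixed identifier per deployed device
RESULTS_FILE = "annotations.csv"


def play_fragment(path):
    """Play a short audio clip; aplay is a common ALSA command-line player on Linux boards."""
    subprocess.run(["aplay", path], check=False)


def read_answer():
    """Stand-in for two physical push buttons: accept 'y' or 'n' on stdin.
    A real deployment would poll GPIO pins (or run speech recognition) and
    time out when the rider leaves the elevator."""
    answer = input("Press y or n, then Enter (anything else skips): ").strip().lower()
    return answer if answer in ("y", "n") else None


def run_once():
    """Present one randomly chosen task and append the response to a CSV log."""
    task = random.choice(TASKS)
    print(task["question"])
    play_fragment(task["fragment"])
    answer = read_answer()
    with open(RESULTS_FILE, "a", newline="") as f:
        csv.writer(f).writerow([
            datetime.now(timezone.utc).isoformat(),
            LOCATION,
            task["fragment"],
            answer or "no-response",
        ])


if __name__ == "__main__":
    while True:  # one annotation attempt per passing rider
        run_once()
```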

Highlights

  • We explore local crowdsourcing, which mimics many of the features of online crowdsourcing while additionally exploiting the characteristics of the physical location and the relation that contributors can have with this location

  • Considering the Spatial Content Production Model (SCPM) classification (Hecht and Gergle, 2010), our experiment follows a hybrid approach by introducing a task for the creation of User Generated Content (UGC)

  • Through the deployment of the ElevatorAnnotator platform we collected a significant number of annotations


Summary

Introduction

Crowdsourcing is a proven method for outsourcing a variety of tasks to a large network of people (Howe, 2006). This includes annotation and enrichment tasks, where crowd annotators are asked to provide (additional) metadata for specific datasets. We explore local crowdsourcing, which mimics many of the features of online crowdsourcing while additionally exploiting the characteristics of the physical location and the relation that contributors can have with this location. We investigate the effectiveness of local crowdsourcing in generating accurate metadata enrichments for archival content collections. We present the implementation of this method, the ElevatorAnnotator platform, built on small and affordable hardware.
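The abstract reports accuracy levels for the collected annotations; the exact aggregation procedure is not described in this summary, but a common baseline is to reduce the multiple crowd responses per fragment to a single label by majority vote and compare against a gold standard, as in the illustrative sketch below (all names and data are hypothetical).

```python
from collections import Counter


def majority_vote(responses):
    """Most frequent label among the crowd responses for one fragment; None on a tie or no data."""
    counts = Counter(r for r in responses if r is not None)
    if not counts:
        return None
    ranked = counts.most_common(2)
    if len(ranked) > 1 and ranked[0][1] == ranked[1][1]:
        return None  # tie: leave the fragment undecided
    return ranked[0][0]


def accuracy(aggregated, gold):
    """Fraction of gold-labelled fragments whose aggregated label matches the gold standard."""
    scored = [frag for frag in gold if aggregated.get(frag) is not None]
    if not scored:
        return 0.0
    return sum(aggregated[frag] == gold[frag] for frag in scored) / len(scored)


# Hypothetical example: three riders annotated fragment_001, one rider annotated fragment_002.
votes = {
    "fragment_001.wav": ["y", "y", "n"],
    "fragment_002.wav": ["n"],
}
aggregated = {frag: majority_vote(r) for frag, r in votes.items()}
print(accuracy(aggregated, {"fragment_001.wav": "y"}))  # -> 1.0
```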

