Abstract

The idea that closer things are more related than distant things, known as ‘Tobler’s first law of geography’, is fundamental to understanding many spatial processes. If this concept applies to volunteered geographic information (VGI), it could help allocate tasks efficiently in citizen science campaigns and improve the overall quality of the collected data. In this paper, we use classifications of satellite imagery by volunteers from around the world to test whether local familiarity with landscapes improves their performance. Our results show that volunteers identify cropland slightly better within their home country and slightly worse as the distance between their home and the location shown in an image increases. Volunteers with a professional background in remote sensing or land cover did no better than the general population at this task, but they did not show the decline with distance seen among other participants. Even in a landscape where pasture is easily confused with cropland, regional residents demonstrated no advantage. Where we did find evidence that local knowledge aids classification performance, the realized impact of the effect was tiny. Rather, the inherent difficulty of a task is a much more important predictor of volunteer performance. These findings suggest that, at least for simple tasks, the geographical origin of VGI volunteers has little impact on their ability to complete image classifications.
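
To make the distance analysis concrete, the sketch below shows one way such a test could be set up. It is a minimal, hypothetical example, not the authors' actual pipeline: the input file classifications.csv and the columns volunteer_lat, volunteer_lon, image_lat, image_lon, and correct are assumptions made for illustration. It computes the great-circle distance between each volunteer's home and the image location and fits a logit-linked (binomial) model of classification correctness against that distance.

```python
# Hypothetical sketch: does classification accuracy decline with distance
# from a volunteer's home? File and column names are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 6371.0 * 2 * np.arcsin(np.sqrt(a))

# One row per classification: volunteer home coordinates, image coordinates,
# and whether the rating matched the reference answer (0/1).
df = pd.read_csv("classifications.csv")
df["distance_km"] = haversine_km(df["volunteer_lat"], df["volunteer_lon"],
                                 df["image_lat"], df["image_lon"])

# Logit-linked model: probability of a correct classification as a
# function of distance between the volunteer's home and the image.
X = sm.add_constant(df[["distance_km"]])
model = sm.GLM(df["correct"], X, family=sm.families.Binomial()).fit()
print(model.summary())
```

In this setup, a small negative coefficient on distance_km would correspond to the slight decline in performance with distance reported in the abstract.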

Highlights

  • Our results found support for the idea that local knowledge aids classification in some circumstances, but the magnitude of the effect is so small that, at least for cropland detection, it is unlikely to be of much practical importance for the design of future global classification tasks

  • The geographic origin of participants appears to have little impact on their ability to identify cropland, a finding with clear consequences for the implementation of volunteered geographic information (VGI) campaigns; evaluating volunteer performance in a smaller geographical region, where the effect of local knowledge may be much more relevant, remains a question for future work


Introduction

Crowdsourcing is a new term for an old, but increasingly important, concept: the completion of large projects by combining small, distributed contributions from the public. Even though the term is not yet widely known, many crowdsourced products are widely used, such as Wikipedia, the online, user-contributed encyclopedia whose popularity, and perhaps even accuracy, rivals traditional reference materials [1]. When the goal of a crowdsourcing campaign is to promote and benefit from active public participation in research, the process is often called ‘citizen science’.

Volunteers in the Cropland Capture game showed no clear pattern of work quality as a function of professional background. Among users with more than 1000 points rated, a logit-linked model found no significant difference by professional background: users with a background in remote sensing or land cover agreed with the crowd at an average rate similar to that of other users (Figure 3B).
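
The logit-linked comparison of professional backgrounds described above can be sketched as follows. This is a minimal, hypothetical illustration rather than the paper's actual analysis: the file users.csv and the columns n_rated, n_agree, and expert_background are assumptions made for this example.

```python
# Hypothetical sketch: compare agreement-with-the-crowd rates between
# volunteers with and without a remote-sensing/land-cover background,
# restricted to active users. File and column names are illustrative.
import numpy as np
import pandas as pd
import statsmodels.api as sm

users = pd.read_csv("users.csv")                # one row per volunteer
active = users[users["n_rated"] > 1000].copy()  # keep users with >1000 ratings

# Binomial endog given as (successes, failures): ratings that agreed
# vs. disagreed with the crowd consensus for each volunteer.
endog = np.column_stack([active["n_agree"],
                         active["n_rated"] - active["n_agree"]])

# Predictor: 1 if the volunteer reported a remote-sensing or land-cover
# background, 0 otherwise.
exog = sm.add_constant(active["expert_background"].astype(int))

fit = sm.GLM(endog, exog, family=sm.families.Binomial()).fit()
print(fit.summary())
```

Under these assumptions, a background coefficient indistinguishable from zero would match the finding that experts agreed with the crowd at roughly the same rate as other volunteers.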

