Abstract

Crowdsourcing is a popular means of acquiring data, but the use of such data is limited by concerns about its quality. This is evident within cartography and the geographical sciences more generally, where the quality of volunteered geographic information (VGI) is recognized as a major challenge to address if the full potential of citizen sensing in mapping applications is to be realized. Here, a means to characterize the quality of volunteers, based only on the data they contribute, was used to explore issues connected with the quantity and quality of volunteers for attribute mapping. The focus was on data in the form of annotations or class labels provided by volunteers who visually interpreted an attribute, land cover, from a series of satellite sensor images. A latent class model was able to provide accurate characterizations of the quality of volunteers, in terms of the accuracy of their labelling, irrespective of the number of cases that they labelled. The accuracy with which a volunteer could be characterized tended to increase with the number of contributing volunteers but was typically good at all but small numbers of volunteers. Moreover, the ability to characterize volunteers in terms of the quality of their labelling could be used constructively. For example, volunteers could be ranked by quality, and the ranking used to select a subset as input to a subsequent mapping task. This was particularly important because an identified subset of volunteers could undertake a task more accurately than the larger group of which they were part. The results highlight that both the quantity and quality of volunteers need consideration and that the use of VGI may be enhanced through information on the quality of the volunteers derived entirely from the data provided, without any additional information.
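The abstract describes characterizing volunteer quality with a latent class model fitted to the contributed labels alone. As an illustration only, the sketch below uses one common formulation of this idea, a Dawid-Skene-style model fitted by expectation-maximization; the exact model, data, and class scheme used in the study are not given here, so the function name, the simulated volunteers, and the three-class label array are all assumptions for demonstration.

```python
import numpy as np

def dawid_skene_em(labels, n_classes, n_iter=50, smoothing=1e-6):
    """EM for a Dawid-Skene-style latent class model.

    labels : (n_items, n_volunteers) int array of class indices,
             with -1 where a volunteer did not label an item.
    Returns
        post      : (n_items, n_classes) posterior over each item's true class
        confusion : (n_volunteers, n_classes, n_classes) estimated
                    P(volunteer assigns label l | true class is t)
    """
    n_items, n_volunteers = labels.shape
    observed = labels >= 0

    # Initialise the class posteriors from a simple majority vote.
    post = np.full((n_items, n_classes), smoothing)
    for v in range(n_volunteers):
        idx = np.flatnonzero(observed[:, v])
        post[idx, labels[idx, v]] += 1.0
    post /= post.sum(axis=1, keepdims=True)

    for _ in range(n_iter):
        # M-step: class priors and one confusion matrix per volunteer.
        prior = post.mean(axis=0)
        confusion = np.full((n_volunteers, n_classes, n_classes), smoothing)
        for v in range(n_volunteers):
            for i in np.flatnonzero(observed[:, v]):
                confusion[v, :, labels[i, v]] += post[i]
        confusion /= confusion.sum(axis=2, keepdims=True)

        # E-step: posterior over the latent true class of each item.
        log_post = np.tile(np.log(prior), (n_items, 1))
        for v in range(n_volunteers):
            idx = np.flatnonzero(observed[:, v])
            # Add log P(observed label | each candidate true class).
            log_post[idx] += np.log(confusion[v][:, labels[idx, v]]).T
        log_post -= log_post.max(axis=1, keepdims=True)
        post = np.exp(log_post)
        post /= post.sum(axis=1, keepdims=True)

    return post, confusion

# Hypothetical usage: 200 items, 3 land-cover classes, 10 simulated volunteers
# whose chance of labelling correctly ranges from 0.55 to 0.95.
rng = np.random.default_rng(0)
truth = rng.integers(0, 3, size=200)
skill = np.linspace(0.55, 0.95, 10)
labels = np.where(rng.random((200, 10)) < skill,
                  truth[:, None],
                  rng.integers(0, 3, size=(200, 10)))

post, confusion = dawid_skene_em(labels, n_classes=3)
prior = post.mean(axis=0)
# Estimated per-volunteer accuracy: prior-weighted diagonal of each confusion matrix.
accuracy = (confusion * np.eye(3)).sum(axis=2) @ prior
ranking = np.argsort(accuracy)[::-1]  # best-characterized volunteers first
```

In this formulation the ranking derived from the estimated confusion matrices could be used, as the abstract suggests, to select a subset of high-quality volunteers for a subsequent mapping task; no reference data are required, since both the item classes and the volunteer accuracies are inferred jointly from the labels.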
