Abstract

In this article, we present a two-level approach for the crowd-based collection of vehicles from 3D point clouds. In the first level, crowdworkers are asked to identify the coarse positions of vehicles in 2D rasterized shadings derived from the 3D point cloud. To increase the quality of the results, we exploit the wisdom-of-the-crowd principle, which states that averaging multiple estimates from a group of individuals often yields an outcome better than most of the underlying estimates, or even better than the best single estimate. To this end, each crowd job is duplicated 10 times and the multiple results are integrated with the DBSCAN clustering algorithm. In the second level, we use the integrated results as prior information for extracting small subsets of the 3D point cloud, which are then presented to crowdworkers for approximating the included vehicle by means of a Minimum Bounding Box (MBB). Again, the crowd jobs are duplicated 10 times and an average bounding box is calculated from the individual bounding boxes. We discuss the quality of the results of both steps and show that the wisdom of the crowd significantly improves the completeness as well as the geometric quality. With a tenfold acquisition, we achieve a completeness of 93.3 percent and a geometric deviation of less than 1 m for 95 percent of the collected vehicles.
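The integration step described above (duplicated position annotations clustered with DBSCAN, then averaged per cluster) can be sketched as follows. This is a minimal illustration with a hand-rolled DBSCAN; the parameter values `eps` and `min_pts`, as well as the sample coordinates, are illustrative assumptions and not the settings used in the study:

```python
import math

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: returns one cluster label per point (-1 = noise)."""
    labels = [None] * len(points)
    cluster = -1

    def neighbors(i):
        # Brute-force epsilon neighborhood (includes the point itself).
        return [j for j in range(len(points))
                if math.dist(points[i], points[j]) <= eps]

    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:
            labels[i] = -1          # provisionally noise
            continue
        cluster += 1
        labels[i] = cluster
        seeds = list(nbrs)
        while seeds:                # expand the cluster from its core points
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster  # noise becomes a border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            nbrs_j = neighbors(j)
            if len(nbrs_j) >= min_pts:
                seeds.extend(nbrs_j)
    return labels

def integrate(points, eps, min_pts):
    """Average the members of each cluster -> one position per vehicle."""
    labels = dbscan(points, eps, min_pts)
    clusters = {}
    for p, lbl in zip(points, labels):
        if lbl >= 0:
            clusters.setdefault(lbl, []).append(p)
    return [(sum(x for x, _ in pts) / len(pts),
             sum(y for _, y in pts) / len(pts))
            for pts in clusters.values()]

# Simulated annotations of two vehicles plus one stray click:
clicks = [(10.1, 20.0), (9.9, 20.1), (10.0, 19.9), (10.2, 20.2),
          (30.0, 40.1), (29.8, 39.9), (30.1, 40.0),
          (100.0, 100.0)]
# -> two integrated positions; the stray click is discarded as noise
print(integrate(clicks, eps=1.0, min_pts=3))
```

Requiring `min_pts` annotations per cluster is what filters out the occasional erroneous click: a position only survives integration if several independent workers marked roughly the same spot.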

Highlights

  • Crowdsourcing, a neologism of the words “crowd” and “outsourcing” (Howe 2006), describes the outsourcing of activities of companies to an indefinite mass of people through an open call via the internet

  • The main reasons users collect OSM data voluntarily are that their contributions are free and that other users benefit from them in the form of digital maps (Budhathoki and Haythornthwaite 2012). This does not work for all applications: in the field of geodata collection, many tasks could in principle be solved with crowdsourcing, but the bottleneck is finding enough volunteers and building an active community

  • The recruited crowdworkers are directed via a URL to the Graphical User Interface (GUI) installed on our own servers


Summary

INTRODUCTION

Crowdsourcing, a neologism of the words “crowd” and “outsourcing” (Howe 2006), describes the outsourcing of company activities to an indefinite mass of people through an open call via the internet. The main reasons users collect OSM data voluntarily are that their contributions are free and that other users benefit from them in the form of digital maps (Budhathoki and Haythornthwaite 2012). This does not work for all applications: in the field of geodata collection, many tasks could in principle be solved with crowdsourcing, but the bottleneck is finding enough volunteers and building an active community. The crowd is composed of people with unknown and diverse skills, abilities, interests, personal goals, and technical resources (Daniel et al. 2018). Another problem, especially in paid crowdsourcing, is dishonest workers who try to maximize their income by submitting as many tasks as possible, producing incomplete or sloppy results (Hirth et al. 2011).

DATA COLLECTION
TEST AREA
GUI for coarse positioning of vehicles
Crowdsourcing campaign for coarse positioning
Data integration
Quality analysis
APPROXIMATION WITH MBBS
Quality control on task design
Crowdsourcing campaign
Findings
DISCUSSION