Abstract

Mobile data crowdsourcing has found a broad range of applications (e.g., spectrum sensing, environmental monitoring) by leveraging the wisdom of a potentially large crowd of workers (i.e., mobile users). A key metric of crowdsourcing is data accuracy, which depends on the quality of the participating workers' data (e.g., the probability that the reported data equals the ground truth). However, a worker's data quality can be its own private information (which the worker learns, e.g., based on its location), and the worker may have an incentive to misreport it, which can in turn mislead the crowdsourcing requester about the accuracy of the data. This issue is further complicated by the fact that the worker can also manipulate the effort it makes in the crowdsourcing task and the data it reports to the requester, either of which can also mislead the requester. In this paper, we devise truthful crowdsourcing mechanisms for Quality, Effort, and Data Elicitation (QEDE), which incentivize strategic workers to truthfully report their private quality and data to the requester, and to exert the effort desired by the requester. The truthful design of the QEDE mechanisms overcomes the lack of ground truth and the coupling in the joint elicitation of worker quality, effort, and data. Under the QEDE mechanisms, we characterize the socially optimal and the requester's optimal task assignments, and analyze their performance. We show that the requester's optimal assignment is determined by the largest virtual valuation rather than the highest quality among workers, where a worker's virtual valuation depends on both its quality and the quality's distribution. We evaluate the QEDE mechanisms using simulations, which demonstrate the truthfulness of the mechanisms and the performance of the optimal task assignments.
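
For concreteness, the sketch below illustrates how an assignment driven by virtual valuations can differ from one driven by raw quality. It assumes the standard Myerson-style virtual valuation phi_i(q) = q - (1 - F_i(q)) / f_i(q), where F_i and f_i are worker i's quality CDF and PDF; the abstract does not give the paper's exact definition, so the function names, the per-worker uniform distributions, and the zero-reserve rule here are all illustrative assumptions rather than the paper's actual mechanism.

```python
# Hypothetical sketch (not the paper's mechanism): requester's optimal
# assignment by largest virtual valuation, assuming the standard
# Myerson-style form phi_i(q) = q - (1 - F_i(q)) / f_i(q).

from typing import Callable, List, Optional, Tuple

Dist = Tuple[Callable[[float], float], Callable[[float], float]]  # (cdf, pdf)


def virtual_valuation(q: float, dist: Dist) -> float:
    """Myerson-style virtual valuation of a reported quality q."""
    cdf, pdf = dist
    return q - (1.0 - cdf(q)) / pdf(q)


def requester_optimal_assignment(reports: List[float],
                                 dists: List[Dist]) -> Optional[int]:
    """Assign the task to the worker with the largest virtual valuation;
    assign to no one if every virtual valuation is negative (zero reserve,
    an assumption for illustration)."""
    phis = [virtual_valuation(q, d) for q, d in zip(reports, dists)]
    best = max(range(len(phis)), key=phis.__getitem__)
    return best if phis[best] >= 0 else None


# Worker 0: quality ~ Uniform[0, 1], so phi(q) = 2q - 1.
# Worker 1: quality ~ Uniform[0, 2], so phi(q) = 2q - 2.
dists: List[Dist] = [(lambda q: q, lambda q: 1.0),
                     (lambda q: q / 2.0, lambda q: 0.5)]

# Worker 1 reports the higher quality (1.2 > 0.9), but worker 0 has the
# larger virtual valuation (0.8 > 0.4) and therefore wins the assignment.
print(requester_optimal_assignment([0.9, 1.2], dists))  # -> 0
```

With symmetric workers and a regular (monotone) virtual valuation, the two rules coincide; the example uses asymmetric quality distributions precisely to show a case where the highest-quality worker is not assigned the task.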
