Abstract

Crowdsourcing has become an important tool for collecting data for various artificial intelligence applications, and auctions can be an effective way to allocate work and determine rewards on a crowdsourcing platform. In this article, we focus on the crowdsourcing of small tasks such as image labeling and voice recording, where we face a number of challenges. First, workers have different limits on the amount of work they are willing to do, and they may also misreport these limits in their bids for the work. Second, if the auction is repeated over time, unsuccessful workers may drop out of the system, reducing competition and diversity. To tackle these issues, we first extend Myerson's celebrated optimal auction mechanism from a single-parameter bid to the case where the bid consists of the unit cost of work, the maximum amount of work one is willing to do, and the actual work completed. We show that a simple payment mechanism is sufficient to ensure a dominant strategy for the workers, and that this dominant strategy is robust to the workers' true utility functions. Second, we propose a novel, flexible work allocation mechanism, which allows the requester to balance cost efficiency against equality. While cost minimization is obviously important, encouraging equality in the allocation of work increases the diversity of the workforce and promotes long-term participation on the crowdsourcing platform. Our main results are proved analytically and validated through simulations.
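
To make the cost-efficiency/equality trade-off concrete, here is a minimal Python sketch, not the paper's actual mechanism: it blends a cost-minimizing greedy allocation with an equal-split (water-filling) allocation via a hypothetical `fairness` weight. The `Bid` fields mirror the bid components named in the abstract (unit cost and maximum work); the blending rule and all function names are illustrative assumptions.

```python
# Illustrative sketch only: blend a cheapest-first allocation with an
# equal-split allocation using a fairness weight in [0, 1]. This is an
# assumption for exposition, not the mechanism proposed in the paper.

from dataclasses import dataclass
from typing import Dict, List


@dataclass
class Bid:
    worker_id: str
    unit_cost: float   # reported cost per unit of work
    capacity: float    # reported maximum amount of work


def greedy_allocation(bids: List[Bid], total_work: float) -> Dict[str, float]:
    """Assign work to the cheapest workers first, up to their capacities."""
    remaining = total_work
    alloc = {b.worker_id: 0.0 for b in bids}
    for b in sorted(bids, key=lambda b: b.unit_cost):
        take = min(b.capacity, remaining)
        alloc[b.worker_id] = take
        remaining -= take
        if remaining <= 0:
            break
    return alloc


def equal_allocation(bids: List[Bid], total_work: float) -> Dict[str, float]:
    """Spread work as evenly as capacities allow (water-filling)."""
    alloc = {b.worker_id: 0.0 for b in bids}
    remaining = total_work
    active = list(bids)
    while remaining > 1e-9 and active:
        share = remaining / len(active)
        still_active = []
        for b in active:
            take = min(b.capacity - alloc[b.worker_id], share)
            alloc[b.worker_id] += take
            remaining -= take
            if alloc[b.worker_id] < b.capacity - 1e-9:
                still_active.append(b)
        active = still_active
    return alloc


def blended_allocation(bids: List[Bid], total_work: float,
                       fairness: float) -> Dict[str, float]:
    """Convex combination of the two allocations; fairness=0 is pure cost
    minimization, fairness=1 is maximal equality."""
    cheap = greedy_allocation(bids, total_work)
    even = equal_allocation(bids, total_work)
    return {b.worker_id: (1 - fairness) * cheap[b.worker_id]
            + fairness * even[b.worker_id] for b in bids}


if __name__ == "__main__":
    bids = [Bid("alice", 1.0, 40), Bid("bob", 2.0, 40), Bid("carol", 5.0, 40)]
    print(blended_allocation(bids, total_work=60, fairness=0.5))
```

Because both component allocations respect each worker's reported capacity and assign the same total amount of work, any convex combination of them does as well, so the fairness weight only shifts how the fixed workload is distributed across workers.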
