Abstract

Context: Existing studies have shown that award settings for software crowdsourcing tasks can be accurately predicted to reflect the size and complexity of the tasks. However, since award is one of the most important motivating factors for online crowdsourcing workers, task requesters need not only to estimate what the nominal price of their tasks should be, but also to justify what the strategic price should be in order to incentivize broader worker participation and higher-quality final submissions. Goal: To address these questions, this paper reports an empirical study that develops further understanding of the relationship between task award and associated worker behaviors. Method: We develop a conceptual award-behavior model, formulate a set of research questions about the relationships between award and workers' behavior and performance, and conduct four empirical studies on 514 crowdsourcing tasks extracted from the TopCoder platform. Results: Major results include: (1) in general, negative correlations between award and worker behavior metrics; (2) a decreasing tendency to make submissions as the number of registrants increases; (3) a weak positive correlation of 0.19 between the number of registrants and the score of the winning submission; and (4) for similar tasks, the relationship between award and worker behavior follows a variety of inverted U-shaped curves. Conclusions: We believe these preliminary findings can help task requesters with better task planning, and we hope to stimulate further discussion and research in strategic crowd coordination.
