Abstract
Background: Crowdsourcing, an emerging paradigm for accomplishing tasks by inviting unknown workers across the Internet to compete, is gaining popularity in various domains. Crowdsourcing task requesters usually offer different bonuses to secure the desired worker performance. Most existing studies focus on the general crowdsourcing market and report inconsistent observations on how different incentive strategies affect worker performance. Few studies have investigated this issue for more complex or intelligent tasks such as software crowdsourcing. Aims: To bridge this gap and develop a better understanding of the relationship between task incentives and worker performance in software crowdsourcing, this study examines the strategic pricing behaviors of task requesters on the most popular software crowdsourcing platform, TopCoder, and evaluates the impact of monetary incentives on worker performance. Method: We first characterize two specific pricing strategies employed in the software crowdsourcing marketplace, design a two-step methodology to detect and identify different pricing strategies, and propose an algorithm to examine the impact of pricing strategies on workers' behaviors in terms of task participation level, completion velocity, and task quality. An exploratory case study applies the proposed methodology and algorithm to a dataset extracted from the TopCoder platform. Results: The conceptualization of pricing strategies formulates common pricing behaviors in software crowdsourcing.
The main analysis results include: 1) strategic pricing patterns are prevalent in software crowdsourcing practice; 2) higher task incentives can potentially pay off in higher performance, such as more registrants, more submissions, and quicker completion velocity; 3) however, higher incentives do not always improve the submission scores of software crowdsourcing tasks, a phenomenon similar to moral hazard problems in economics, implying that task awards should be increased only modestly; and 4) higher incentives can improve internal code quality, as measured by code bugs and bad smells. Conclusions: We believe these preliminary findings on pricing strategies can support better pricing decision-making and improve the efficiency and fairness of the crowdsourcing market, and we hope they stimulate further discussion and research on strategic crowd coordination.