Abstract

Crowdsourcing leverages human intelligence to solve tasks that cannot be accomplished by automated tools. A crowdsourcing system consists of components such as the requester, the task, the worker, and the crowdsourcing platform. Existing studies do not explore the various features of these components or the dependencies among them. Hence, we analyse the characteristics of the components of crowdsourcing systems using a trace-driven approach. Additionally, to support reproducible research, we introduce a workload generator for crowdsourcing platforms that produces an unbiased workload similar to the empirical one. Finally, we analyse the impact of various characteristics on the quality of answers using both the empirical and synthetic workloads. The results demonstrate that success rate and activeness positively affect the productivity of workers, while the number of available human intelligence tasks (HITs) and their duration affect productivity on each task.

