Abstract

Online crowdsourcing platforms have become an option for researchers to rapidly recruit participants for tasks that require human ingenuity. However, a growing concern is that workers on such platforms, for example Amazon Mechanical Turk (AMT), receive unfair compensation for the tasks they complete. In this article, we explored the effects of the income level of a participant's country and the rate of payment on perceived payment fairness, task quality, and subjective experience. We tested our hypotheses using a 3-way ANOVA and a chi-square test of independence. The results showed that lower compensation increased the number of participants whose data might be unusable for research. Participants in the lower compensation-rate group reported better perceived performance than those who received higher compensation. We also found that participants from high-income countries reported lower perceived effort and frustration than participants from lower-middle-income countries.
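A minimal sketch of the kind of analysis named in the abstract, run on simulated data; the factor and column names (income_level, pay_rate, task_type, fairness, usable) and the third factor are illustrative assumptions, not the authors' actual variables or dataset.

```python
# Hypothetical sketch: 3-way ANOVA and chi-square test of independence on
# simulated data loosely mirroring the abstract's design. Column names and the
# third factor are assumptions for illustration only.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols
from scipy.stats import chi2_contingency

rng = np.random.default_rng(0)
n = 240
df = pd.DataFrame({
    "income_level": rng.choice(["high", "lower_middle"], n),
    "pay_rate": rng.choice(["low", "high"], n),
    "task_type": rng.choice(["survey", "transcription"], n),  # assumed third factor
    "fairness": rng.normal(4.0, 1.0, n),                      # perceived fairness rating
    "usable": rng.choice(["usable", "unusable"], n, p=[0.85, 0.15]),
})

# 3-way ANOVA with all interactions on perceived payment fairness
model = ols("fairness ~ C(income_level) * C(pay_rate) * C(task_type)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))

# Chi-square test of independence: payment rate vs. data usability
contingency = pd.crosstab(df["pay_rate"], df["usable"])
chi2, p, dof, expected = chi2_contingency(contingency)
print(f"chi2={chi2:.2f}, p={p:.3f}, dof={dof}")
```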
