Abstract

Recent studies have empirically validated the data obtained from Amazon's Mechanical Turk. Amazon's Mechanical Turk workers behaved similarly not only in simple surveys but also in tasks used in cognitive behavioral experiments that employ multiple trials and require continuous attention to the task. The present study aimed to extend these findings to data from a Japanese crowdsourcing pool in which participants have different ethnic backgrounds from Amazon's Mechanical Turk workers. In five cognitive experiments, including the Stroop and Flanker tasks, the reaction times and error rates of Japanese crowdsourcing workers were compared and contrasted with those of university students. The results were consistent with those of previous studies, although the students responded more quickly but less accurately than the workers. These findings suggest that the Japanese crowdsourcing sample is another eligible participant pool for behavioral research; however, further investigations are needed to address issues of qualitative differences between student and worker samples.

Highlights

  • Researchers in the behavioral and social sciences have begun to collect data from online surveys and online experiments using participants recruited from an online labor market, a procedure known as crowdsourcing

  • The present study aimed to extend the findings from a previous validation study of AMT as a participant pool for online cognitive behavioral experiments to a Japanese sample collected from a Japanese crowdsourcing service

  • Recent studies with AMT as a participant pool have suggested that Internet-based behavioral experiments work well, even in cognitive tasks that require precise millisecond control for stimulus presentation and response collection within multiple trials (Crump et al., 2013)


Introduction

Researchers in the behavioral and social sciences (such as psychology, linguistics, economics, and political science) have begun to collect data from online surveys and online experiments using participants recruited from an online labor market, a procedure known as crowdsourcing. The demographic status and various psychological properties of workers have been contrasted with those of students or age-matched community samples. Similar results were reported by Goodman et al. (2013), in which workers and students showed similar classic decision-making biases, such as risk preference and the certainty effect.

