Abstract

Researchers in the social sciences are increasingly turning to online data collection panels for research purposes. While there is evidence that crowdsourcing platforms such as Amazon's Mechanical Turk can produce data as reliable as more traditional survey collection methods, little is known about the platform's most experienced respondents, their perceptions of crowdsourced data, and the degree to which these affect data quality. The current study utilises both quantitative and qualitative data to investigate Amazon's Mechanical Turk Masters' perceptions of and attitudes related to data quality (e.g. inattention). Recommendations for researchers using crowdsourced data are provided.
