Abstract
In this study, we investigate the attentiveness of participants recruited through Amazon Mechanical Turk (MTurk) and find a significant level of inattentiveness among the platform's top crowd workers (those classified as 'Master', with an 'Approval Rate' of 98% or more and a 'Number of HITs Approved' of 1,000 or more). A total of 564 individuals from the United States participated in our experiment. They were asked to read a vignette outlining one of four hypothetical technology products and then complete a related survey. Three forms of attention check (logic, honesty, and time) were used to assess attentiveness. Through this experiment we determined that 126 participants (22.3%) failed at least one of the three forms of attention check, with most (94) failing the honesty check, followed by the logic check (31) and the time check (27). Thus, we established that significant levels of inattentiveness exist even among the most elite MTurk workers. The study concludes by reaffirming the need for multiple forms of carefully crafted attention checks, irrespective of whether participant quality is presumed to be high according to MTurk criteria such as 'Master', 'Approval Rate', and 'Number of HITs Approved'. Furthermore, we propose that researchers adjust their proposals to account for the effort and cost required to address participant inattentiveness.
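To make the screening logic concrete, the Python sketch below shows one way a participant could be flagged against the three kinds of attention check described above. This is our illustration, not the authors' actual pipeline: the field names, the 120-second time threshold, and the "fail any check" flagging rule are hypothetical assumptions introduced for the example.

```python
# A minimal sketch (not the study's actual pipeline) of flagging survey
# responses against three attention checks: logic, honesty, and time.
# Field names and the time threshold are illustrative assumptions.

from dataclasses import dataclass

MIN_COMPLETION_SECONDS = 120  # assumed minimum plausible reading/survey time


@dataclass
class Response:
    participant_id: str
    passed_logic: bool       # e.g., answered an instructed-response item correctly
    passed_honesty: bool     # e.g., affirmed reading the vignette attentively
    completion_seconds: int  # total time spent on the task


def failed_any_check(r: Response) -> bool:
    """Flag a participant as inattentive if any of the three checks fails."""
    passed_time = r.completion_seconds >= MIN_COMPLETION_SECONDS
    return not (r.passed_logic and r.passed_honesty and passed_time)


def failure_rate(responses: list[Response]) -> float:
    """Fraction of participants failing at least one attention check."""
    flagged = sum(failed_any_check(r) for r in responses)
    return flagged / len(responses)


if __name__ == "__main__":
    demo = [
        Response("p1", True, True, 300),   # passes all three checks
        Response("p2", True, False, 300),  # fails the honesty check
        Response("p3", True, True, 45),    # fails the time check (too fast)
    ]
    print(f"failure rate: {failure_rate(demo):.1%}")  # -> 66.7%
```

Applied to the study's counts, the same ratio gives the reported figure: 126 flagged participants out of 564 is 126 / 564 ≈ 0.2234, i.e., roughly 22.3%.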
Highlights
Over time, online services for recruiting research participants have increased in popularity [30]
The first analysis examined the frequency with which attention checks were passed or failed by participants; this revealed that 126 of the 564 participants (22.34%) failed at least one form of attention check
Litman et al. [25] describe Mechanical Turk (MTurk) as "a constantly evolving marketplace where multiple factors can contribute to data quality"
Summary
Online services for recruiting research participants have increased in popularity [30]. Amazon Mechanical Turk (MTurk) [15] is one of the oldest and most frequently selected tools in a spectrum of web-based resources, enabling researchers to recruit participants online while lowering the required time, effort, and cost [24, 39]. More recently (2018), Difallah, Filatova, and Ipeirotis [13] found that at least 100,000 workers were registered on the platform, with 2,000 active at any given time. The authors report significant worker turnover, with the half-life for workers estimated to be between 12 and 18 months [13]. Numbers such as those reported by Stewart et al. [40] and Difallah, Filatova, and Ipeirotis [13] indicate that recruiting the same worker more than once for a given experiment is highly unlikely.