Abstract
Background: Online crowdsourcing methods have proved useful for studies of diverse designs in the behavioral and addiction sciences. The remote and online setting of crowdsourcing research may provide easier access to unique participant populations and improved comfort for these participants in sharing sensitive health or behavioral information. To date, few studies have evaluated the use of qualitative research methods on crowdsourcing platforms, and even fewer have evaluated the quality of the data gathered. The purpose of the present analysis was to document the feasibility and validity of using crowdsourcing techniques for collecting qualitative data among people who use drugs.
Methods: Participants (N = 60) with a history of non-medical prescription opioid use and a transition to heroin or fentanyl use were recruited using Amazon Mechanical Turk (mTurk). A battery of qualitative questions was included indexing beliefs and behaviors surrounding opioid use, transition pathways to heroin and/or fentanyl use, and drug-related contacts with structural institutions (e.g., health care, criminal justice).
Results: Qualitative data recruitment was feasible, as evidenced by the rapid sampling of a relatively large number of participants from diverse geographic regions. Computerized text analysis indicated high ratings of authenticity for the provided narratives. These authenticity percentiles were higher than the average for general normative writing samples as well as those collected in experimental settings.
Conclusions: These findings support the feasibility and quality of qualitative data collected in online settings, broadly, and crowdsourced settings, specifically. Future work among people who use drugs may leverage crowdsourcing methods and the access to hard-to-sample populations to complement existing studies in the human laboratory and clinic as well as those using other digital technology methods.