Abstract

Crowdsourcing is a novel method of collecting research data from diverse patient populations, but the quality of research data obtained through crowdsourcing is unknown. The primary aim of this pilot study was to examine how data collected from an online crowdsourcing website compare with the published literature on psoriasis and psoriatic arthritis (PsA). Crowdsourced data were collected from a health crowdsourcing site from August 23, 2008, to June 27, 2011, and compared with findings from systematic reviews, meta-analyses, and clinical trials. A total of 160 online patients with psoriasis or PsA were included in the analysis. Among them, 127 patients with psoriasis provided 313 complete responses on psoriasis symptoms and 276 complete responses on psoriasis treatments; 33 patients with PsA provided 91 complete responses on PsA symptoms and 79 responses on PsA treatments. We compared crowdsourced data on topical treatments, phototherapy, and systemic treatments for psoriasis and PsA with the published literature. For the treatments with the largest response rates, equivalency testing was performed comparing crowdsourced data and the published literature. Overall, crowdsourced data were not equivalent to those published in the medical literature. The crowdsourcing site used different outcome measures from those reported in clinical trials, and assessments of treatment effectiveness differed between crowdsourced data and the published literature. With improvements in data collection, crowdsourcing could become a valuable tool for gathering real-world patient data on psoriasis and PsA.
