Abstract

Poor response rates, self-selection bias, and item noncompletion reduce the generalizability of survey results. This study examined differences in these factors between a paper and an online survey administered to allied health clinicians. Clinicians within a large local health district were initially invited to complete the Research Capacity in Context Tool online via an e-mail link. Following a lower-than-expected response rate, potential selection bias, and item noncompletion, the survey was readministered in paper form to the same cohort of clinicians 6-12 months later. The response rate to the paper survey was higher than to the online survey (27.6% vs. 16.5%). Selection biases were evident and were characterized by seniority and discipline: Junior clinicians responded at rates significantly lower than expected to the online survey but at expected rates to the paper survey. Occupational therapists, speech pathologists, and podiatrists responded at higher rates to the online survey, whereas other disciplines responded at higher rates to the paper survey. Item noncompletion was higher for the online than the paper survey (6.72% vs. 3.8% of questions not completed, respectively), and the patterns of noncompletion also differed. These data suggest that paper surveys are likely to produce less biased and more generalizable data from allied health clinicians.
