Abstract
Background: Low response rates and inadequate question comprehension threaten the validity of survey results. We describe a simple procedure for implementing personalized (as opposed to generically worded) questionnaires in the context of a complex web-based survey of corresponding authors of a random sample of 300 published cluster randomized trials. The purpose of the survey was to gather more detailed information about the informed consent procedures used in each trial, over and above the basic information provided in the trial report. We describe our approach, which allowed extensive personalization without the need for specialized computer technology, and discuss its potential application in similar settings.
Results: The mail merge feature of standard word processing software was used to generate a unique, personalized questionnaire for each author by incorporating specific information from the article, including naming the randomization unit (e.g., family practice, school, worksite) and identifying the specific individuals who may have been considered research participants at the cluster level (family doctors, teachers, employers) and the individual level (patients, students, employees) in questions about informed consent procedures in the trial. The response rate was relatively high (64%, 182/285) and did not vary significantly by author, publication, or study characteristics. The refusal rate was low (7%).
Conclusion: While controlled studies are required to examine the specific effects of our approach on comprehension, quality of responses, and response rates, we showed how mail merge can be used as a simple but useful tool to add personalized fields to complex survey questionnaires, or to request additional information from study authors. One potential application is in eliciting specific information about published articles from study authors when conducting systematic reviews and meta-analyses.
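To make the merge step concrete, the following is a minimal sketch of the same logic in a scripting language rather than a word processor: it fills a questionnaire template from a table of study-specific fields. The CSV file name, column names (author_id, cluster_unit, cluster_individuals, individual_participants), and question wording are illustrative assumptions, not the authors' actual materials; the survey itself used only the mail merge feature of standard word processing software.

import csv
from string import Template

# Questionnaire template with placeholders for the study-specific fields
# that the merge fills in (randomization unit, cluster-level and
# individual-level participants).
QUESTION = Template(
    "In your trial, randomization was by $cluster_unit. "
    "Were $cluster_individuals (cluster level) and/or "
    "$individual_participants (individual level) asked to provide "
    "informed consent?"
)

# Hypothetical table of per-study fields extracted from each article.
with open("study_fields.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        text = QUESTION.substitute(
            cluster_unit=row["cluster_unit"],                # e.g. "family practice"
            cluster_individuals=row["cluster_individuals"],  # e.g. "family doctors"
            individual_participants=row["individual_participants"],  # e.g. "patients"
        )
        # One personalized questionnaire file per corresponding author.
        out_name = "questionnaire_" + row["author_id"] + ".txt"
        with open(out_name, "w", encoding="utf-8") as out:
            out.write(text)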
Highlights
Low response rates and inadequate question comprehension threaten the validity of survey results.
A third and preferred option was to generate a unique, personalized questionnaire for each sample member. This would allow us to customize the questions for each author by incorporating study-specific information from the published article, including naming the randomization unit or “cluster” and referring to the specific individuals who might be considered potential participants at the cluster level and the individual level.
There were no important differences in response rates among the subgroups examined: response rates were similar among primary authors from economically developed and developing countries (p = 0.39), by year of publication (p = 0.36), by journal impact factor (p = 0.95), and among studies conducted in health care organization settings compared to studies conducted in public health and health promotion settings (p = 0.43).
Summary
Low response rates and inadequate question comprehension threaten the validity of survey results. We faced several challenges in designing our questionnaire, including the diversity of the sample with respect to the country and research area of the study authors, potential lack of familiarity with the concept of cluster randomization, the lack of standard definitions of “gatekeepers” and “research participants”, and the diversity of key elements of the trials themselves, including the types of clusters (e.g., medical practices, schools, communities, work sites, sports teams), the types of participants at the individual and/or cluster level, and the specific study interventions and data collection procedures that may or may not have required consent. These challenges made a traditional survey that presented questions in a uniform way problematic to operationalize. Although face-to-face and web-based questionnaire customization has been used for decades at survey organizations, supported by advances in computer-assisted interviewing [8, 9], we were restricted to a simple and inexpensive self-administered method that did not require specialized computer technology.