Abstract

The Ayala and Elder article succinctly summarizes best practices for designing and evaluating social-behavioral interventions to reduce oral health disparities. As a community-based researcher, I would like to add a few points on design and implementation for consideration. I have always been impressed by the level of sophistication and knowledge of community members [1, 2]. Community members know that researchers work to promote specific aims and are under pressure to deliver to their funders. Consequently, there is a chance that participation in focus groups and interview sessions could be biased by the belief that researchers have already defined what they want to hear. The Ayala and Elder paper describes many tools and models; however, to design outcome-driven interventions, researchers should engage with community members and listen to their cues and stories. Focus groups and interviews, while useful, are not sufficient. Ayala and Elder state that “participants are not provided with a complete intervention to evaluate.” I advise that engaging community members in evaluation of the final intervention is also necessary. The piecemeal development of an intervention does not convey the full scope of the final outcomes to community members. It is therefore imperative that, after the development phase is complete, focus groups and interviews be organized to evaluate the acceptability of the full intervention. Building trust in community-based participatory research requires more than engagement in the development phases of a study. The authors suggest inviting or preselecting community members to participate in focus groups or interviews. Although this is a standard approach to organizing focus groups, these invitees are often friends of staff, members of organizations with experience in dealing with researchers, or members of community centers or programs affiliated with universities, which can result in significant selection bias. At worst, selected participants may give responses they believe the researchers want to hear. Another possible bias to consider is how different choices are presented to interviewees. For example, in a large study evaluating the utility of current and new interventions for toothache, a large representative sample was selected for one-to-one interviews at the participants' homes [1]. The interviews indicated that the participants preferred a new treatment over conventional ones. Further exploration revealed that the participants were giving the research team the answers they thought the team preferred. Bias in design and execution can be subtle and may require in-depth semi-structured interviews to detect. A final suggestion is to engage community members in thinking about “systems of interventions.” Logic models provide a useful tool for engaging community leaders [3]. Adding this exercise to the design phase of a study would significantly enhance the acceptability, and perhaps even the relevance, of any intervention to underserved communities. Community-based research is difficult to plan and execute, and most researchers are not well trained to work with and approach communities. Without an in-depth understanding of underserved communities, research to reduce disparities will have limited impact. The author has been the recipient of an NIDCR grant.
