Graduate medical education (GME) programs must ensure that they are able to collect accurate information about resident competence through assessment tools that are fit for purpose. An assessment form or process is called "fit for purpose" when there is good alignment between the tool (how something is assessed) and the intent (the specific knowledge, skill, or attitude/belief that is being assessed).

In our family medicine GME program, we identified that the generic workplace-based summative assessment tool provided to community-based family medicine obstetrics (FMOB) clinical teachers was not fit for purpose. As a result, these teachers were uncertain about program expectations regarding competence, the assessment form was challenging and frustrating to complete, and our program struggled to extract useful assessment data from completed forms. To address this issue, we took a systematic approach to developing a workplace-based assessment tool that was specific to the clinical context of FMOB and user-friendly for clinical teachers.

Our goals were to develop (1) a fit-for-purpose workplace-based assessment tool for community-based FMOB teachers that could be used for accurate assessment without the need for faculty development, and (2) an evidence-guided process for designing similar context-specific tools in the future.

We met our first goal by designing the FMOB competence rubric (FMOB-CR). The FMOB-CR comprises 2 elements. The first is a rubric containing specific statements and examples of resident performance. The rubric clearly outlines the expected level of competence on a 3-point rating scale ("cause for concern," "acceptable competence," "exemplary competence"), organized by the 6 Skill Dimensions of Family Medicine (similar to CanMEDS roles).1 The second element is a simple online assessment form on which FMOB teachers specify their resident's level of competence in each Skill Dimension, guided by the statements and examples in the rubric.

The FMOB-CR uses plain language and context-specific clinical examples, along with a simple online assessment form, to clearly communicate expectations of competence to teachers (and residents) without additional faculty development. This user-friendly tool should make it easier for FMOB clinical teachers to identify residents who are underperforming, allowing the program to intervene and support residents in difficulty more effectively.

To meet our second goal, we developed an evidence-guided process for designing workplace-based assessment tools that are fit for purpose in specific clinical contexts. Our design process included: assembling a tool development team with relevant expertise, including a resident; consultations with FMOB teachers and residents; an environmental scan and review of local assessment forms and assessments used by other FMOB GME programs; a modified Delphi process with local GME program faculty to develop and refine the table of competence statements; a consensus-building process within the research team to revise the statements for the rating scale; and implementation of the assessment rubric with concurrent collection of validity evidence. For both the FMOB-CR and the development process, we collected validity evidence according to Messick's unified concept of validity.2

The Table details the validity evidence collected to date.
Fifteen FMOB teachers surveyed showed strong agreement across 5 items about the utility of the rubric in allowing them to accurately assess residents (overall M=3.25/4, Likert scale 1=strongly disagree to 4=strongly agree) and across 3 items about the usefulness of the rubric in helping them to understand program expectations of resident competence (overall M=3.36/4); 14 of 15 teachers preferred the new form to the old one.

We hope that this worked example of the FMOB-CR and its development process may serve as a blueprint for other institutions to develop context-specific assessment tools.