Abstract

Background: Systems-based practice is one of the six general competencies proposed by the Accreditation Council for Graduate Medical Education in its Outcome Project. However, little has been published on its assessment, possibly because the systems-based practice competency has been viewed as difficult to define and measure.

Purpose: The purpose of this study was to determine whether a full performance-based examination of systems-based practice, with cases simulated and scored by standardized participants in the health care system, could be feasibly constructed and implemented to provide reliable and valid measurements.

Methods: In the 1st year of the project (2008), four systems-based practice cases were developed and pilot tested with 13 residents. Videotapes of the residents were studied to develop an instrument for subsequent assessment of performance by standardized participants. In the 2nd year (2009), the examination was expanded to a full 12 cases, which were completed by 11 second-year residents, and psychometric analyses were performed on the scores.

Results: The generalizability coefficient for the full 12-case examination based on scoring by standardized participants was .71, nearly equal to the .78 obtained with scoring by faculty physician observers. The correlation between total scores obtained with standardized participants and physician observers was .78.

Conclusions: A performance-based examination can provide a feasible and reliable assessment of systems-based practice. However, attempts to evaluate convergent and discriminant validity, by correlating systems-based practice performance assessments with faculty members' mean global ratings of residents on the six competencies throughout training, were unsuccessful due to a lack of independence between the rated dimensions.
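For readers unfamiliar with generalizability theory, the reported coefficient summarizes score reliability in a person-by-case design. A minimal sketch of the standard relative G-coefficient formula is below; the variance components are illustrative assumptions, not values reported in the study.

```python
# Sketch of a relative generalizability (G) coefficient for a
# crossed person x case design, as used in generalizability theory.

def g_coefficient(var_person, var_residual, n_cases):
    """Between-person variance divided by itself plus the
    person-by-case residual variance averaged over cases."""
    return var_person / (var_person + var_residual / n_cases)

# Hypothetical variance components (assumed for illustration only):
var_person = 1.0    # variance between residents' true scores
var_residual = 4.9  # person-by-case interaction / error variance

print(round(g_coefficient(var_person, var_residual, 12), 2))  # -> 0.71
```

Averaging over more cases shrinks the error term, which is why expanding the examination to 12 cases can yield a usable coefficient even when individual cases are noisy.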

