Abstract

The objectives of this study were to: (1) describe a standardized clinical reasoning (CR) assessment process for preclinical physician assistant (PA) students; (2) describe student grades on a checklist by comparing clinical faculty members' judgments on a global rating scale (GRS) with judgments made by a faculty panel; and (3) evaluate interrater reliability between individual faculty members' grading and faculty panel grading. Three clinical faculty members created a checklist to assess preclinical PA students' CR on a standardized patient assessment. Individual faculty graders and a panel of faculty graders evaluated student performance, and interrater reliability between the individual faculty members and the faculty panel was assessed with Cohen's kappa. Participants included 88 PA students and 12 faculty evaluators. The faculty panel changed 11 of the individual faculty members' grades (12.5%). Cohen's kappa indicated substantial agreement (κ = 0.698, 95% CI 0.54-0.85) between the individual faculty members' grades and the faculty panel's grades. Comparing a checklist, a GRS, and a panel review improves the standardization of assessment and reduces grade inflation.
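
The agreement statistic reported above can be reproduced as follows. This is a minimal Python sketch of Cohen's kappa for two raters' categorical grades; the pass/fail coding and the example data are illustrative assumptions, not the study's actual data.

    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        # Cohen's kappa: (p_o - p_e) / (1 - p_e), where p_o is the observed
        # agreement and p_e is the agreement expected by chance from each
        # rater's marginal label frequencies.
        n = len(rater_a)
        p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        counts_a, counts_b = Counter(rater_a), Counter(rater_b)
        labels = set(rater_a) | set(rater_b)
        p_e = sum(counts_a[lab] * counts_b[lab] for lab in labels) / n ** 2
        return (p_o - p_e) / (1 - p_e)

    # Hypothetical pass/fail grades for the same eight students from an
    # individual faculty grader and from the faculty panel:
    individual = ["pass", "pass", "fail", "pass", "fail", "pass", "pass", "fail"]
    panel      = ["pass", "pass", "fail", "pass", "pass", "pass", "pass", "fail"]
    print(f"kappa = {cohens_kappa(individual, panel):.3f}")

By the conventional Landis and Koch benchmarks, kappa values between 0.61 and 0.80 indicate substantial agreement, which is the interpretation the study applies to its reported value of 0.698.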
