Purpose
Medicine is practiced in a collaborative and interdisciplinary manner; however, medical training and assessment remain largely isolated in traditional departmental silos. Two Entrustable Professional Activities (EPAs) developed by the American Board of Surgery are multidisciplinary in nature and offer a unique opportunity to study interdisciplinary assessment.

Methods
EPA microassessments were collected from Surgery and Emergency Medicine (EM) faculty between July 2018 and May 2020. Differences in the feedback provided by faculty were assessed using two natural language processing (NLP) techniques: (1) automated coding algorithms and (2) topic modeling. Summative content analysis was used to identify themes in the free-text feedback, and automated coding algorithms for these themes were developed using regular expressions. Topic modeling was performed using latent Dirichlet allocation.

Results
A total of 549 assessments were collected for the two EPAs: 198 for General Surgery (GS) Consultation and 351 for Trauma. Twenty-seven EM and 27 Surgery faculty provided assessments for 71 residents. EM faculty were significantly more likely than Surgery faculty to submit feedback coded as Communication, Demeanor, and Timeliness (all chi-square p-values < 0.01). No significant differences were found for Clinical Performance, Skill Level, or Areas for Improvement. Similarly, topic modeling indicated that assessments submitted by EM faculty focused on communication, timeliness, and interpersonal skills, while those submitted by Surgery faculty focused on the residents' abilities to effectively gather information and correctly diagnose the underlying pathology.

Conclusions
Feedback from EM and Surgery faculty differed significantly based on NLP analyses. EPA assessments should stem from multiple sources to avoid assessment gaps and to represent a more holistic picture of performance.
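The abstract only summarizes the analytic approach; the sketch below illustrates, under stated assumptions, how regex-based theme coding, a chi-square comparison between faculty groups, and latent Dirichlet allocation topic modeling might be applied to free-text feedback in Python with scikit-learn and SciPy. The example comments, theme patterns, contingency counts, and model settings are hypothetical and are not the authors' actual pipeline or data.

```python
import re
from scipy.stats import chi2_contingency
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Illustrative feedback comments (not study data).
comments = [
    "Resident communicated the plan clearly and called back promptly.",
    "Thorough history and exam; correctly identified appendicitis.",
]

# Hypothetical regular-expression coding rules for two of the themes.
theme_patterns = {
    "Communication": re.compile(r"\b(communicat\w*|called? back|updated)\b", re.I),
    "Timeliness": re.compile(r"\b(prompt\w*|timely|quickly|without delay)\b", re.I),
}

# Apply the coding rules: 1 if a comment matches a theme, else 0.
coded = [
    {theme: int(bool(pat.search(text))) for theme, pat in theme_patterns.items()}
    for text in comments
]

# Example 2x2 contingency table (theme coded / not coded, by EM vs. Surgery
# faculty) and a chi-square test of the difference; the counts are made up.
table = [[40, 60],   # EM faculty
         [15, 85]]   # Surgery faculty
chi2, p, dof, _ = chi2_contingency(table)

# Topic modeling with latent Dirichlet allocation on bag-of-words counts.
vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(comments)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(counts)

# Print the top words per topic to help interpret what each topic captures.
terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"Topic {k}: {', '.join(top)}")
```

In practice the regex dictionary would be derived from the summative content analysis, and the topic model would be fit on the full corpus of assessments before comparing topic prevalence between EM and Surgery faculty.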