Abstract

Exposure to patients and clinical diagnoses drives learning in graduate medical education (GME). However, variation exists in the breadth of these experiences. Measuring such variation would provide practice data to inform residents’ understanding of the breadth of their patient experiences. We have developed an automated system to identify resident provider-patient interactions (rPPIs) and demonstrated accurate attribution at a single institution. The objective of this study was to understand the landscape of trainees’ planned learning and to iteratively design a tool to support this goal. To achieve these objectives at two institutions new to the AMA “Advancing Change” initiative, we used a mixed-methods approach to develop and evaluate a “mid-point report” of patient encounters. Qualitative outcomes include a guided exploration of usefulness, usability, and intent to use, as well as an understanding of the resources trainees would use for learning and how our system may deliver those resources. Quantitative outcomes from a summative usability test of the mid-point report will include time on task, task completion rate, and the proportion of trainees who perceive the report to be useful for identifying gaps in clinical experiences and guiding learning.
