Abstract

With increasing availability of immediate patient access to pathology reports, it is imperative that all physicians be equipped to discuss pathology reports with their patients. No validated measures exist, however, to assess how pathology report findings are communicated during patient encounters. We piloted a scoring rubric for evaluating medical students' communication of pathology reports to standardized patients. The rubric was developed iteratively using the Pathology Competencies for Medical Education and the Accreditation Council for Graduate Medical Education pathology residency milestones. After a brief training, third- and fourth-year medical students completed 2 standardized patient encounters, presenting simulated benign and malignant pathology reports. Encounters were video recorded and scored by 2 pathologists to calculate overall and item-specific interrater reliability. All students recognized the need for pathology report teaching, which was lacking in their medical curriculum. Interrater agreement was high for malignant report scores (intraclass correlation coefficient, 0.65) but negligible for benign reports (intraclass correlation coefficient, 0). On malignant reports, most items demonstrated good interrater agreement; the exceptions were discussing the block (cassette) summary, explaining the purpose of the pathology report, and acknowledging uncertainty. Participating students (N = 9) felt the training was valuable given their limited prior exposure to pathology reports. This pilot study demonstrates the feasibility of using a structured rubric to assess the communication of pathology reports to patients. Our findings also provide a scalable example of training in pathology report communication, which can be incorporated into undergraduate medical curricula to equip more physicians to facilitate patients' understanding of their pathology reports.
