Abstract

Student feedback is a critical component of the teacher-learner cycle. However, there is no gold standard course or clerkship evaluation form, and there is limited research on the impact of changing the evaluation process. Results from a focus group and a pre-implementation feedback survey, coupled with best practices in survey design, were used to improve all course/clerkship evaluations for academic year 2013-2014. In spring 2014 we asked all students at the University of Utah School of Medicine, United States of America, to complete the same feedback survey (post-implementation survey). We assessed the evaluation climate with three measures on the feedback survey: overall satisfaction with the evaluation process; how often students gave effort to the process; and how often students used shortcuts. Scores from these measures were compared between 2013 and 2014 with Mann-Whitney U-tests. Response rates were 79% (254) for 2013 and 52% (179) for 2014. Students' overall satisfaction scores were significantly higher (more positive) post-implementation than pre-implementation (P < 0.001). There was no change in how often students gave effort to completing evaluations (P = 0.981) and no change in how often they used shortcuts to complete evaluations (P = 0.956). We were able to change overall satisfaction with the medical school evaluation culture, but not the effort students gave to completing evaluations or their use of shortcuts. To ensure accurate evaluation results, we will need to focus our efforts on the time needed to complete course evaluations across all four years.
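The comparison described above can be sketched in code. This is a minimal illustration of a Mann-Whitney U test on two independent groups of Likert-type ratings, using SciPy; the ratings below are invented for illustration and are not the study's data.

```python
# Hypothetical sketch: comparing pre- vs post-implementation satisfaction
# ratings (1-5 Likert scale) with a two-sided Mann-Whitney U test, the
# nonparametric test the study used. All values here are made up.
from scipy.stats import mannwhitneyu

pre_2013 = [2, 3, 3, 2, 4, 3, 2, 3, 3, 2]   # hypothetical pre-implementation ratings
post_2014 = [4, 4, 3, 5, 4, 4, 5, 3, 4, 4]  # hypothetical post-implementation ratings

# mannwhitneyu handles tied ranks (common with Likert data) via a
# tie-corrected asymptotic approximation.
stat, p = mannwhitneyu(pre_2013, post_2014, alternative="two-sided")
print(f"U = {stat}, p = {p:.4f}")
```

With ordinal survey data like this, a rank-based test is preferred over a t-test because Likert responses are not interval-scaled and are rarely normally distributed.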

Highlights

  • Student feedback is a critical component of the teacher-learner cycle

  • Focus group questions included: “How would you define good teaching?” “What do you think about the evaluation tools currently used at our institution?” “How do you arrive at an overall course rating?” “What kind of consequences would you like to see drawn from course evaluations?”

  • In academic year (AY) 2014 we introduced an optional midpoint formative survey with four items so that course directors could review feedback and make changes before the end of a course

Summary

There was no change in how often students said they gave effort to completing evaluations (P = 0.981) and no change in how often they used shortcuts to complete evaluations (P = 0.956). We did, however, substantially decrease the number of items and the number of evaluations students completed between pre- and post-implementation. We took the low post-implementation response rate as an indication that students were not strongly negative about the new course evaluation process. To ensure educators get accurate feedback from students, we will need to focus our efforts on the time needed to complete course evaluations across all years of medical school. Future research will need to determine the usefulness of evaluation feedback to course/clerkship directors.
