Abstract

Background: There is no clear way to know how changes in a course have impacted the student learning experience. In a recent series of studies, our group established Q-methodology as a robust alternative to traditional Likert-scale course evaluations. Through a revised statement-ranking system combined with qualitative and quantitative analysis methods, Q-methodology mitigates the disparate feedback and averaged scores that make Likert-scale data difficult to interpret and act upon. Our previous work has demonstrated that students can be statistically grouped based upon shared opinions, preferences, and values, and that evidence-based course reform decisions can be made with this information. In a remarkable year requiring significant and rapid adjustment to accommodate pandemic-related measures, understanding how students perceived their course experience is important not only to examine outcomes but also to ensure that future evidence-based changes remain possible.

Objective: Through sequential Q-methodology course evaluations, this study evaluates how course changes, made in response to emergency pandemic teaching and learning, alter students' perceptions of their course experience.

Methods: In 2019 and 2020, students in a multi-disciplinary undergraduate anatomy and physiology course were asked to complete the same Q-methodology course evaluation at year's end, ranking 37 statements relative to each other. Between academic years, significant changes were made to course delivery (laboratory and tutorial sessions moved online) as well as assessment modality (multiple-choice exams changed to a group assignment). Data from both 2020 (n=64) and 2019 (n=125) were analyzed together via by-person factor analysis, and factors were interpreted using Q-methodology conventions. Participant distributions across the factors were compared between cohorts (2020 vs 2019) and across demographics via Pearson's chi-square test.
Results: A three-factor solution statistically grouped students with a generally 1) positive, 2) negative, and 3) neutral disposition toward the course. The 2019 and 2020 cohorts were distributed significantly differently amongst the factors (p = 0.006), such that a greater proportion of the 2020 than the 2019 cohort (58% vs 37%) expressed dissatisfaction with the course. Dissatisfaction primarily surrounded issues with course structure and assessment, two aspects which were significantly altered in response to the 2020 pandemic. Demographics were not significantly different across factors.

Conclusion: Clear evidence for course reform relies upon knowing what worked and what did not. In this case, while a negative shift in disposition may seem unremarkable, Q-methodology uncovered key elements contributing to student dissatisfaction which are important to consider as programs weigh online learning as a long-term option.
