Abstract

Background
The assessment of team performance within large-scale Interprofessional Learning (IPL) initiatives is an important but underexplored area. It is essential for demonstrating the effectiveness of collaborative learning outcomes in preparing students for professional practice. Using Kane's validity framework, we investigated whether peer assessment of student-produced videos depicting collaborative teamwork in an IPL activity was sufficiently valid for decision-making about team performance, and where the sources of error might lie, in order to optimise future iterations of the assessment.

Methods
A large cohort of health professional students (n = 1218) from 8 different professions was divided into teams of 5–6 students. Each team collaborated on producing a short video to evidence its management of one of 12 complex patient cases. Students from two other teams that had worked on the same case individually rated each video using a previously developed assessment scale. A generalisability study quantified the sources of error affecting the reliability of peer assessment of collaborative teamwork. A decision study modelled the impact of differing numbers of raters. A modified Angoff procedure determined the pass/fail mark.

Results
Within a large-scale learning activity, peer assessment of collaborative teamwork was reliable (G = 0.71) based on scoring by students from two teams (n = 10–12) for each video. The main sources of variation were the stringency and subjectivity of the student assessors. Although professions marked with differing stringency, and individual student assessors held differing views of the quality of a particular video, none of the individual assessor variance was attributable to the assessor's profession. Teams performed similarly across the 12 cases overall, and no profession marked differently on any particular case.

Conclusion
Peer assessment of a student-produced video depicting interprofessional collaborative teamwork around the management of complex patient cases can be valid for decision-making about student team performance. Further refinement of marking rubrics and student assessor training could reduce assessor subjectivity. The impact of profession on assessing individual peers, and the case-specificity of team performances in IPL settings, need further exploration. This innovative approach offers a promising avenue for enhancing the measurement of collaborative learning outcomes in large-scale IPL initiatives.
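The decision study mentioned above projects how reliability would change with a different number of raters per video. As a minimal sketch only: the snippet below applies the standard Spearman-Brown-style D-study projection under the simplifying assumption of a single random rater facet, whereas the authors' actual G-study involved multiple facets (cases, professions). The function name `projected_g` and the rater count of 11 (the midpoint of the reported 10–12) are illustrative assumptions, not taken from the paper.

```python
def projected_g(g0: float, n0: float, n: float) -> float:
    """Project a generalisability (G) coefficient observed with n0 raters
    to a hypothetical design with n raters, treating raters as the only
    random facet (Spearman-Brown-style D-study projection).

    From g0 = var_team / (var_team + var_err / n0), the error variance
    contribution rescales by n0/n when the rater count changes to n.
    """
    return (g0 * n) / (g0 * n + (1 - g0) * n0)


# Abstract reports G = 0.71 with roughly 11 raters per video (assumed midpoint).
baseline = projected_g(0.71, 11, 11)   # reproduces 0.71 at the same design
fewer = projected_g(0.71, 11, 6)       # fewer raters: lower reliability
more = projected_g(0.71, 11, 22)       # doubling raters: higher reliability
```

In practice, a full D-study would re-partition all estimated variance components (teams, cases, raters, and their interactions) rather than this single-facet simplification, but the projection above captures why adding raters improves the dependability of peer scores.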
