Abstract

Collaborative problem solving (CPS) is a complex skill composed of multiple subskills. While there are several definitions and frameworks for the measurement of CPS, the relationships between the subskills remain unclear. Understanding these relationships can have a direct impact on how assessments are designed, how measures are developed, and how teaching interventions are planned. The approach outlined here represents the use of an inference model in assessment, rather than a model in which direct responses to questions with correct and incorrect answers are used to identify abilities. Inferences about capabilities are made based on student behaviors in an online environment, as represented in log files generated during test administration. In this study, the identified behaviors are common across students, CPS tasks, and assessments. Through the application of an Item Response Theory model, the behaviors can be evaluated much like items on a traditional test. By comparing the difficulty and ordering of items, differences and similarities for each behavior within and across assessments can be examined. This paper presents findings from data collected from over 3,000 students, identifying shifts in patterns of student behaviors. The patterns across students, during testing, and across testing situations for CPS subskills such as negotiation, responsiveness, perseverance, and communication are presented and discussed. The extent to which design differences between assessments may lead to differences in how students respond to them is highlighted.
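The idea of treating logged behaviors like test items can be sketched in a few lines. The snippet below is a minimal illustration, not the study's actual model: the student-by-behavior data are invented, and difficulty is estimated with a simple logit of the observed proportion (a full IRT analysis would fit difficulties by maximum likelihood). It shows how rarer behaviors surface as "harder" items and how behaviors can then be ordered by difficulty.

```python
import math

# Hypothetical binary behavior indicators: rows = students, columns = behaviors.
# 1 means the behavior was observed in that student's log file, 0 means it was not.
logs = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 1, 0, 1],
    [1, 0, 0, 0],
]
behaviors = ["negotiation", "responsiveness", "perseverance", "communication"]

def difficulty(column):
    """Rasch-style difficulty proxy: the negative logit of the proportion of
    students exhibiting the behavior (rarer behavior -> harder 'item')."""
    p = sum(column) / len(column)          # proportion exhibiting the behavior
    p = min(max(p, 1e-6), 1 - 1e-6)        # clamp to avoid infinite logits
    return -math.log(p / (1 - p))

difficulties = {
    name: difficulty([row[j] for row in logs])
    for j, name in enumerate(behaviors)
}

# Order behaviors from easiest (most common) to hardest (rarest),
# as one would order items on a traditional test.
ordering = sorted(behaviors, key=lambda b: difficulties[b])
print(ordering)
```

Comparing such orderings computed separately for two assessments would highlight where a behavior is harder to elicit in one design than in the other.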
