Abstract

Computer‐based educational assessments often include items that involve drag‐and‐drop responses. There are different ways that drag‐and‐drop items can be laid out and different choices that test developers can make when designing these items. Currently, these decisions are based on experts’ professional judgments and design constraints, rather than empirical research, which might threaten the validity of interpretations of test outcomes. To this end, we investigated the effect of drag‐and‐drop item features on test‐taker performance and response strategies with a cognition‐centered approach. Four hundred and seventy‐six adult participants solved content‐equivalent drag‐and‐drop mathematics items under five design variants. Results showed that: (a) test takers’ performance and response strategies were affected by the experimental manipulations, and (b) test takers mostly used cognitively efficient response strategies regardless of the manipulated item features. Implications of the findings are provided to support test developers’ design decisions.
