Abstract

Current research on examination response time has focused on tests composed of traditional multiple-choice items; consequently, the impact of innovative or complex item formats on examinee response time is not well understood. The present study used multilevel growth modeling to investigate examinee characteristics associated with response time differences on a medical certification exam containing two item formats. A linear model described examinee pacing on the traditional multiple-choice section, while a quadratic (curvilinear) model described pacing on a diagnostic study section composed of complex, graphic-intensive multiple-response items. Examinees’ gender, ability, and age explained variability in response times on each exam section; notably, older examinees’ initial pacing was slower on the multiple-choice section but faster on the complex graphic-intensive multiple-response section. The findings have implications for test developers worldwide who intend to incorporate complex item formats into high-stakes exams.
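
For readers unfamiliar with the general technique, the following is a minimal sketch of a multilevel (mixed-effects) growth model of log response time of the kind described above, written in Python with statsmodels. The data layout and column names (log_rt, position, gender, ability, age, examinee_id) are hypothetical illustrations, not the study's actual variables or model specification.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per examinee-item response.
# Assumed columns:
#   log_rt      - natural log of item response time (seconds)
#   position    - item sequence position within the exam section
#   gender, ability, age - examinee-level covariates
#   examinee_id - grouping variable for the multilevel structure
df = pd.read_csv("response_times.csv")

# Quadratic growth model (as one might fit for the diagnostic-study section):
# fixed linear and quadratic position terms, cross-level interactions between
# position and examinee covariates, and a random intercept and position slope
# for each examinee. Dropping I(position**2) yields the linear growth model
# that might be fit for the multiple-choice section.
model = smf.mixedlm(
    "log_rt ~ position + I(position**2) + gender + ability + age"
    " + position:gender + position:ability + position:age",
    data=df,
    groups=df["examinee_id"],
    re_formula="~position",
)
result = model.fit()
print(result.summary())

In such a specification, the fixed effects for gender, ability, and age describe differences in initial pacing (the intercept), while the position-by-covariate interactions describe differences in how pacing changes across the section.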
