Abstract

Open (open-book) online assessment has become a widely used tool in higher education for monitoring learning progress and teaching effectiveness. It has gained popularity because it is flexible to administer and makes response behavior data available for studying response processes. Analyzing these data, however, raises several challenges: how to handle outlying response times, how to make use of the information in item response order, how item response time, response order, and item scores are related, and how to help classroom teachers quickly check whether student responses align with the design of the assessment. The purposes of this study are threefold: (1) to provide a solution for handling outlying response times that arise from the design of open online formative assessments (i.e., ample or unrestricted testing time), (2) to propose a new measure for investigating item response order, and (3) to discuss two analytical approaches useful for studying response behaviors: data visualization and the Bayesian generalized linear mixed effects model (B-GLMM). An application of these two approaches is illustrated using open online quiz data. The B-GLMM results showed that item response order was related to item response time but not to item scores, and that item response time was related to item scores, with its effect moderated by item cognitive level. The findings from the B-GLMM and the data visualization were consistent, which helped instructors see how student responses aligned with the assessment design.
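
To make the two quantitative steps above concrete, the sketch below shows one possible way to cap outlying response times and fit a Bernoulli B-GLMM with crossed random effects for students and items. It is a minimal Python sketch, not the authors' implementation: the file name quiz_log.csv, the column names (student, item, response_time, response_order, score, cognitive_level), the winsorizing cutoff, and the use of the bambi/PyMC libraries are all assumptions made for illustration.

```python
# Hypothetical sketch: handling outlying response times and fitting a
# Bayesian generalized linear mixed effects model (B-GLMM).
# Column names, the cutoff, and the choice of library are illustrative
# assumptions, not the paper's actual implementation.
import numpy as np
import pandas as pd
import bambi as bmb
import arviz as az

# Hypothetical quiz log: one row per student-item attempt.
df = pd.read_csv("quiz_log.csv")

# With ample or unrestricted testing time, a few response times can be
# extremely long. One simple treatment: winsorize at the 95th percentile,
# then log-transform to reduce skew.
cap = df["response_time"].quantile(0.95)
df["log_rt"] = np.log(df["response_time"].clip(upper=cap))

# Bernoulli B-GLMM for dichotomous item scores: response time interacts
# with item cognitive level, response order enters as a main effect, and
# crossed random intercepts account for students and items.
model = bmb.Model(
    "score ~ log_rt * cognitive_level + response_order + (1|student) + (1|item)",
    df,
    family="bernoulli",
)
idata = model.fit(draws=2000, chains=4)

# Posterior summaries for inspecting the estimated effects.
print(az.summary(idata))
```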

Highlights

  • Over the past few decades, open online assessments have gained popularity through Massive Open Online Courses (MOOCs), such as Coursera and Edx

  • The course instructor confirmed that these patterns were expected: most students had already learned the topics covered by items #9 and #10, which led to higher correct rates and shorter response times, whereas item #3 covered a new topic and might require more time to recall a large amount of reading material even though it was designed as a low cognitive level item

  • The results suggested that item response order and item cognitive level were related to item response time

Introduction

Open (or “open book”) online assessments have gained popularity through Massive Open Online Courses (MOOCs), such as Coursera (https://www.coursera.org/) and Edx (https://www.edx.org/). Students can view the course materials while taking the test, at a location and pace of their own choosing. Another advantage of a formative assessment, when computerized, is that it can automatically record a variety of information in addition to the students’ answers and their marks. The response behaviors recorded by the open online assessment platform are called computer log data, which track what students do throughout the process of completing the assessment. Making good use of these data collected online can help monitor students’ learning progress, provide tailored support in time, and shed light on how student response behaviors are related to their assessment performance.
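
For concreteness, a computer log record for a single item attempt might contain fields like the ones sketched below; the field names and types are hypothetical, since the variables actually captured depend on the assessment platform.

```python
# Hypothetical shape of one computer log record per student-item attempt;
# the exact fields depend on the assessment platform.
from dataclasses import dataclass

@dataclass
class ItemLogRecord:
    student_id: str
    item_id: str
    response_order: int       # position of the item in the student's answering sequence
    response_time_sec: float  # time spent on the item
    score: float              # mark awarded for the response
```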
