Abstract

Because schools could not use face-to-face tests to evaluate students' learning effectiveness during the COVID-19 pandemic, many schools implemented computer-based tests (CBT) for this evaluation. From the perspective of Sustainable Development Goal 4, whether this type of test conversion affects students' performance in answering questions is an issue worthy of attention. However, studies have not yielded consistent findings on the equivalence of examinees' scores on CBTs and paper-and-pencil tests (PPT) when taking the same multiple-choice tests: some studies have revealed no significant differences between the two formats, whereas others have found significant differences. This study adopted a counterbalanced experimental design to investigate the effects of test format, computerised presentation type, difficulty of item group, and administration order of item groups of different difficulty levels on examinees' answering performance. In this study, 381 primary school fifth graders in northern Taiwan completed an achievement test on the topic of Structure and Functions of Plants, which is part of the primary school Natural Science course. The achievement test comprised 16 multiple-choice items. After data collection and analysis, no significant differences in examinees' answering performance were identified among the PPT, the CBT with single-item presentation, and the CBT with multiple-item presentation. However, further analysis indicated that the difficulty of the item group and the administration order of item groups of different difficulty levels had significant influences on answering performance. The findings suggest that, compared with a PPT, examinees exhibit better answering performance when taking multiple-choice tests in a CBT with multiple-item presentation.

Highlights

  • Since the outbreak of the COVID-19 pandemic in 2019, many countries have curbed the spread of the virus by reducing crowd movement and close interaction and by temporarily closing certain places, such as educational institutions and public recreational venues.

  • This result suggests that, without considering the difficulty of the item group or the administration order of item groups of different difficulty levels, the correct answering rates on the achievement test did not differ significantly according to whether the test items were presented in a paper-and-pencil test (PPT), a CBT with single-item presentation (CS), or a CBT with multiple-item presentation (CM).

  • This study revealed that, in the difficult-item group, the test format and administration order had significant influences on the correct answering rate of the achievement test: (1) PPT versus CS: in the PPT, the correct answering rate was significantly higher when the difficult-item group was presented in the first part than when it was presented in the second part; when the difficult-item group was presented in the second part, the correct answering rate in the CS was significantly higher than that in the PPT. (2) PPT versus CM: the correct answering rate for the difficult-item group in the CM was significantly higher than that in the PPT, and administration order had no significant influence on the correct answering rate.


Introduction

Sustainability 2021, 13, 9548

Since the outbreak of the COVID-19 pandemic in 2019, many countries have curbed the spread of the virus by reducing crowd movement and close interaction and by temporarily closing certain places, such as educational institutions and public recreational venues. Because the information and communication technology (ICT) resources that individual students can access differ, which may affect both their opportunities to learn and the fairness of evaluation, schools have needed not only to provide online courses to continue educational activities but also to modify their original evaluation standards and methods to ensure fair evaluation during school closures [2,3]. Because students are unable to take face-to-face paper-and-pencil tests, take-home exams, one of the most common alternative assessment methods, are often delivered online in the form of CBT, allowing students to read on screen or in a paper-based format at home [1,2]. Beyond the COVID-19 pandemic era, CBT has gradually been applied to various assessment activities: CBTs can analyse responses and provide the required feedback immediately, facilitate recording and querying, and support the generation of test items and shared item databases [4,5].

