Abstract

Comparative studies on paper-and-pencil and computer-based tests principally focus on statistical analysis of students’ performances. In educational assessment, however, comparing students’ performance (in terms of right or wrong answers) does not amount to comparing the problem-solving processes students follow. In this paper, we present a theoretical tool for task analysis that allows us to highlight how students’ problem-solving processes could change when switching from paper to computer format, and how these changes could be affected by the use of one environment rather than the other. In particular, the aim of our study is to identify a set of indexes that highlight the possible consequences that specific changes in task formulation have in terms of task comparability. Finally, we propose an example of the use of the tool for comparing paper-based and computer-based tasks.

Highlights

  • The increasing use of tests administered in digital environments allows research in mathematics education to develop new fields of study

  • Research on computer-based tests is concerned, on the one hand, with the validity of these tests and, on the other hand, with their comparability with existing paper-based tests

  • One of the first studies conducted on the topic involved the National Assessment of Educational Progress (NAEP)


Summary

Introduction

This article stems from wider Ph.D. research focused on comparing students’ problem-solving processes when tackling mathematics tasks in paper-based and computer-based environments (Lemmo, 2017). We hypothesise that even a small variation in the information given could change the nature of the task with regard to the actions that can be activated. This is not a new problem: research shows that familiarity with the numerical data presented in the information of a word problem can create difficulties and hinder the resolution process. These studies show that the complexity of a task, in terms of the percentage of correct or wrong answers, changes when the numerical data vary; for example, the same word problem can become harder for students when small whole numbers are replaced by decimals. Among these studies, Fischbein carried out very extensive research on word problems, considering their different components and varying them (either a limited number or one at a time). Thus, two questions may remain of the same type and yet vary, depending on the request submitted.

  • Question type
  • Request type
  • Structure of the task
  • Modalities of communicating the response
  • Presence of information
  • Actions and operations that the student can activate to solve the task
  • Systems of signs used to represent the task
  • Conclusion
