Abstract

In recent decades there have been significant changes in the conceptualization of reading, as well as in how this activity should be assessed. Interest in the analysis of reading processes has led to the emergence of new explanatory models based primarily on the contributions of cognitive psychology. In parallel, there have been notable advances in measurement procedures, especially in models based on Item Response Theory (IRT), as well as in the capacity and performance of specialized software for managing and analyzing data. These changes have contributed significantly to the rise of testing procedures such as computerized adaptive tests (CATs), whose defining characteristic is that the sequence of items presented is adapted to the level of competence the examinee demonstrates. Likewise, incorporating elements of dynamic assessment (DA), in which graduated prompts are offered, makes it possible to obtain information about the type and degree of support a student requires to optimize his or her performance. In this sense, the confluence of contributions from DA and CATs offers a new possibility for approaching the assessment of learning processes. In this article, we present a longitudinal study developed in two phases, through which a computerized dynamic adaptive assessment battery of reading processes (EDPL-BAI) was configured. The overall research involved 1,831 students (46% girls) from 13 public schools in three regions of Chile. The purpose of this study was to analyze the differential contribution to reading competence of the dynamic scores obtained by a subsample of 324 students (47% girls) from third to sixth grade after the implementation of a set of adaptive dynamic tests of morpho-syntactic processes. The results of the structural equation modeling indicate a good global fit. Individual relationships show a significant contribution to reading competence of the calibrated score, which reflects the estimated knowledge level, as well as of the dynamic scores based on the assigned value of the graduated prompts required by the students. These results showed significant predictive value for reading competence and incremental validity relative to the predictions made by static criterion tests.

Highlights

  • In educational contexts, assessing students’ cognitive skills and reading processes is central to making informed decisions about the support they require to reach their full potential

  • In Model 1, the dynamic assessment (DA) factor was composed of the calibrated scores, while Model 2 explored the dynamic scores based on the inverse of the value of the required aids

  • This holds both for the calibrated score (Nivel), which reflects the student’s estimated knowledge level, and for the dynamic scores based on the inverse of the value of the required aids (DS_Inv); a minimal scoring sketch follows below
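The article does not spell out the exact DS_Inv formula here. As a purely illustrative sketch, one way to derive a dynamic score from graduated prompts is to weight each aid by how explicit it is and take the complement of the proportion of aid used, so that less required help yields a higher score. All prompt names and weights below are hypothetical assumptions for demonstration, not the EDPL-BAI scoring rules.

```python
# Hypothetical graduated-prompt weights: more explicit aids carry higher cost.
PROMPT_WEIGHTS = {"none": 0, "general_hint": 1, "specific_hint": 2, "full_model": 3}

def dynamic_score(prompts_used, max_weight=3):
    """Return a score in [0, 1]: 1.0 when no help was needed,
    lower values as more explicit aids were required (inverse-of-aid idea)."""
    total = sum(PROMPT_WEIGHTS[p] for p in prompts_used)
    worst = max_weight * len(prompts_used)
    return 1.0 - total / worst if worst else 1.0

# Example: a student who needed one general hint on each of three items.
print(dynamic_score(["general_hint", "general_hint", "general_hint"]))  # ~0.67
```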


Introduction

In educational contexts, assessing students’ cognitive skills and reading processes is central to making informed decisions about the support they require to reach their full potential. In the context of reading development, new findings in cognitive psychology have contributed to the emergence of new explanatory models (Hacker et al., 2009; Thiede et al., 2009). This has happened in parallel with significant advances in measurement procedures, especially in relation to models based on Item Response Theory (IRT), and with a significant increase in the capacity and performance of specialized data analysis and data management software. These changes have contributed to the rise of computer-based adaptive assessment procedures, known as Computerized Adaptive Testing (CAT) (Embretson and Reise, 2000).
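To make the adaptive mechanism concrete, the following minimal sketch illustrates how a CAT under a Rasch (1PL) model can select each next item by matching item difficulty to the current ability estimate and then nudge that estimate after each response. It is a hypothetical illustration, not the EDPL-BAI implementation: the item bank, difficulties, responses, and the simplified update rule are all assumptions (an operational CAT would use maximum-likelihood or Bayesian estimation).

```python
import math

# Hypothetical item bank: each item has a difficulty parameter b (Rasch / 1PL model).
item_bank = {"item_01": -1.2, "item_02": -0.4, "item_03": 0.0,
             "item_04": 0.7, "item_05": 1.5}

def p_correct(theta, b):
    """Probability of a correct response under the Rasch (1PL) model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def next_item(theta, administered):
    """Pick the unused item whose difficulty is closest to the current ability
    estimate (the most informative item under the Rasch model)."""
    candidates = {k: b for k, b in item_bank.items() if k not in administered}
    return min(candidates, key=lambda k: abs(candidates[k] - theta))

def update_theta(theta, b, correct, step=0.5):
    """Crude ability update: move theta toward or away from the item difficulty
    according to the residual between the observed and expected response."""
    residual = (1.0 if correct else 0.0) - p_correct(theta, b)
    return theta + step * residual

# Simulated adaptive session: the item sequence adapts to the examinee's level.
theta, administered = 0.0, []
for response in [True, True, False, True]:  # mock answers
    item = next_item(theta, administered)
    administered.append(item)
    theta = update_theta(theta, item_bank[item], response)
    print(f"{item}: theta -> {theta:.2f}")
```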

