Abstract

This study investigates a novel approach to formative writing assessment that evaluates students' writing skills across three levels of language (word, sentence, and discourse) using automated measures of word choice, syntax, and cohesion. Writing from students in Grades 6 and 8 (n = 240 each) was analyzed using Coh-Metrix. Multigroup confirmatory factor analysis evaluated a hypothesized three-factor levels-of-language model, and multigroup structural equation modeling determined whether these factors predicted performance on a state writing achievement test comprising a Direct Assessment of Writing (DAW) and an Editing and Revising (ER) test. Results indicated that a subset of nine Coh-Metrix measures successfully modeled three latent levels-of-language factors at each grade level. Results also indicated that the DAW test was predicted by the latent Discourse factor, while the ER test was predicted by the latent Discourse and Sentence factors. Findings provide a proof of concept for automated formative assessment using a levels-of-language framework. Furthermore, although not the primary goal of the study, the results may lay the groundwork for new levels-of-language detection algorithms that could be incorporated into automated writing evaluation software to support combined automated and teacher assessment and feedback approaches.
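To make the modeling pipeline concrete, the sketch below shows how a three-factor measurement model with the reported structural paths (DAW regressed on Discourse; ER regressed on Discourse and Sentence) could be specified in lavaan-style syntax using the Python package semopy. This is an illustration only: the abstract does not name the nine Coh-Metrix indicators, so the indicator names below are hypothetical placeholders; semopy is not necessarily the software the authors used; and the sketch fits a single group rather than the multigroup (Grade 6 vs. Grade 8) models the study reports.

```python
import pandas as pd
import semopy

# Hypothetical specification in lavaan-style syntax. The three latent
# factors (Word, Sentence, Discourse) are each measured by three of the
# nine Coh-Metrix indicators; indicator names here are placeholders
# because the abstract does not identify the specific measures.
# The two regressions encode the structural paths the abstract reports:
# DAW predicted by Discourse; ER predicted by Discourse and Sentence.
MODEL_DESC = """
Word =~ word_1 + word_2 + word_3
Sentence =~ sent_1 + sent_2 + sent_3
Discourse =~ disc_1 + disc_2 + disc_3
DAW ~ Discourse
ER ~ Discourse + Sentence
"""

# Assumed input: one row per student with columns for the nine
# Coh-Metrix indicators plus DAW and ER scores (file name hypothetical).
data = pd.read_csv("cohmetrix_grade6.csv")

model = semopy.Model(MODEL_DESC)
model.fit(data)

print(model.inspect())           # factor loadings and regression paths
print(semopy.calc_stats(model))  # fit indices (chi-square, CFI, RMSEA, ...)
```

In the study itself, the models were fit as multigroup models across Grades 6 and 8; extending this sketch in that direction would mean fitting the model per grade and testing invariance constraints, for which software with explicit multigroup support (such as lavaan in R) may be more convenient.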

