Abstract
What happens to writing instructors’ feedback when they use a common rubric and an online tool to respond to student papers in a first-year composition course at a large state university in the United States? To investigate this question, we analyze the 118,611 comments instructors made when responding to 17,433 student essays. Using concordance software to quantify teachers’ use of rubric terms, we found instructors were primarily concerned with global, substantive, higher-order concerns—such as responding to students’ rhetorical situations, use of reason, and organization—rather than lower-order concerns about grammar or formatting. Given that past research has found teachers overemphasize lower-order concerns such as grammar, mechanics, and punctuation (Connors and Lunsford, 1988; Lunsford and Lunsford, 2008; Moxley and Joseph, 1989, 1992; Schwartz, 1984; Sommers, 1982; Stern and Solomon, 2006), these results suggest a possible generational shift in response to student writing. Aggregating teacher commentary, student work, and peer review responses via digital tools and employing concordance software to identify big-data patterns illuminates a new assessment practice for Writing Program Administrators—the practice of Deep Assessment.
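The concordance-style analysis the abstract describes amounts to counting how often rubric terms appear across a corpus of instructor comments. A minimal sketch of that kind of frequency query is below; the rubric terms listed here are hypothetical placeholders, not the study’s actual rubric categories, and the study itself used dedicated concordance software rather than custom code.

```python
from collections import Counter
import re

# Hypothetical rubric terms for illustration only; the study's
# actual rubric categories may differ.
RUBRIC_TERMS = ["focus", "evidence", "organization", "style", "format"]

def rubric_term_counts(comments):
    """Count case-insensitive whole-word occurrences of each rubric
    term across a corpus of instructor comments -- a simple stand-in
    for a concordance-style frequency query."""
    counts = Counter({term: 0 for term in RUBRIC_TERMS})
    for comment in comments:
        words = re.findall(r"[a-z]+", comment.lower())
        for term in RUBRIC_TERMS:
            counts[term] += words.count(term)
    return counts

comments = [
    "Strong evidence here, but the organization needs work.",
    "Check your format; otherwise the focus is clear.",
]
print(rubric_term_counts(comments))
```

At corpus scale (118,611 comments), the same tallies let analysts compare how often higher-order terms (e.g., organization) appear relative to lower-order ones (e.g., format).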