Abstract

Aim: The screencast (SC), a 21st century analytics tool, enables the simultaneous recording of audio and video feedback on any digital document, image, or website, and may be used to enhance feedback systems in many educational settings. Although previous findings show that students and teachers have had positive experiences with recorded commentary, the method is still rarely used by teachers in composition classrooms. Possible reasons include the accelerated pace at which classroom technology has changed over the past decade, concerns over privacy when new technologies are integrated into the classroom, and the general unease instructors may feel when asked to fold a new technology into their established composition pedagogy and response routine. The aim of this study was to replicate previous findings in favor of SC feedback and to expand that body of research beyond instructor-to-student SC interactions into SC-mediated peer review. The study thus seeks to improve on the written peer review practices most common in writing instruction today, practices that tend to produce mediocre learning outcomes and fail to capitalize on 21st century technological innovations. This research note demonstrates the validity of SC as a writing analytics research tool with the potential to capture and measure student learning. It also seeks to inspire those who have been reluctant to adopt SC in both digital and face-to-face educational environments by providing pragmatic guidance for doing so in ways that increase student learning and foster a more rigorous and discursive peer-to-peer review process.

Problem Formation: While research suggests positive student perceptions of screencast instructor response, results for peer-to-peer screencast response are mixed. After several successful years of experience with instructor-to-student SC feedback, the author wondered what would happen if she asked students to use screencast technology to mediate peer review. How might students’ attitudes and perceptions shape the use of peer-to-peer screencast technology in the composition classroom? To address these questions, the author developed a survey measuring the user reliability of the SC technology and the student affect and revision initiative it produces.

Information Collection: This study extends Anson’s (2016) research and insights by reporting findings from a study of 138 writing students. Survey data were collected during the 2015-2016 academic year at three institutions. At High Point University, the author of this research note asked freshman composition students in a traditional face-to-face lecture course to conduct a series of peer review sessions, including both traditional written comments and SC comments, over a 16-week semester. Students were surveyed after each peer review experience, and the results form the foundation of this research note’s conclusions. In addition to survey responses, the researchers collected the screencasts exchanged during peer-to-peer interactions in each educational setting.

Conclusions: The author provides an in-depth analysis of students’ experiences, perceptions, and attitudes toward giving and receiving screencast feedback, focusing on the method’s impact on student revision initiative compared with that of a traditional written feedback system.
Some conclusions are also drawn regarding the user reliability and effectiveness of the screencast technology itself, specifically the free program Jing, available through TechSmith.com, which offers a streamlined, user-friendly SC interface and stores all SC recordings in the cloud behind individualized hyperlinks, thereby alleviating concerns regarding student privacy.

Directions for Further Research: While this research note provides compelling evidence supporting the use of SC in composition classrooms, there are many opportunities for continued study, particularly within the emerging field of writing analytics. Although the student-to-student screencasts were collected in this study, they were not analyzed as a qualitative data set; the researchers relied on self-reported survey data to assess the degree of revision initiative among the students surveyed. The screencasts themselves offer a treasure trove of data, should future researchers have the capacity to code that data set by hand or to apply automated natural language processing. This peer-to-peer SC feedback could then be compared with similar corpus analyses of instructor-to-student feedback gathered by other writing analytics scholars. Further research could also collect the student writing itself and track the revisions students made after receiving SC feedback and traditional written feedback from their peers. Researchers would then be able to compare the actual changes student writers made, the extent of those changes (surface-level or higher-order revisions), and the degree of revision initiative each student reported in the survey. To facilitate future research in this area, the author has included teaching resources for those new to screencast technology and analytics.
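
As a concrete illustration of how the self-reported survey data described above might be summarized, the following minimal Python sketch compares mean Likert-scale ratings across the two peer review conditions. The column names and sample values are hypothetical, invented for illustration; they are not the study's actual instrument or data.

    # Minimal sketch (hypothetical data): summarize Likert-scale survey
    # responses by peer review condition. Column names are illustrative.
    import pandas as pd

    # One row per post-peer-review survey response; ratings are 1-5.
    responses = pd.DataFrame({
        "condition": ["screencast", "screencast", "written", "written"],
        "revision_initiative": [5, 4, 3, 3],
        "ease_of_use": [4, 5, 4, 4],
    })

    # Mean rating per condition gives a first-pass comparison of student
    # affect and self-reported revision initiative across feedback modes.
    print(responses.groupby("condition").mean())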
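
The corpus analysis proposed above could begin with something as simple as keyword counting over screencast transcripts. The sketch below is one hypothetical approach, assuming the recordings have already been transcribed to plain text; the transcripts and keyword buckets are invented for illustration and would need to be replaced by a validated coding scheme.

    # Hypothetical sketch: count feedback-related terms in plain-text
    # transcripts of peer screencasts, as a crude stand-in for a
    # hand-built coding scheme. Transcripts and keywords are invented.
    from collections import Counter
    import re

    transcripts = [
        "I think your thesis is clear, but this paragraph needs more evidence.",
        "Maybe reorganize the introduction so the argument comes first.",
    ]

    # Simple keyword buckets approximating two feedback categories.
    categories = {
        "higher_order": {"thesis", "argument", "evidence", "reorganize"},
        "surface": {"comma", "spelling", "grammar", "typo"},
    }

    counts = Counter()
    for text in transcripts:
        tokens = re.findall(r"[a-z]+", text.lower())
        for label, keywords in categories.items():
            counts[label] += sum(1 for t in tokens if t in keywords)

    print(counts)  # e.g., Counter({'higher_order': 4, 'surface': 0})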
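
Likewise, tracking revisions between drafts could start from a plain-text comparison. The following sketch uses Python's standard difflib module and a crude size threshold to separate surface-level from higher-order edits; the sample drafts and the 20-character threshold are illustrative assumptions, not a validated measure.

    # Hypothetical sketch: compare a pre-peer-review draft with the
    # post-review revision and bucket each edit by size as a rough proxy
    # for surface-level versus higher-order revision.
    import difflib

    before = "The internet changed education. Students learn online now."
    after = ("The internet has transformed education by making feedback "
             "immediate. Students learn online now.")

    surface, higher_order = 0, 0
    matcher = difflib.SequenceMatcher(None, before, after)
    for op, i1, i2, j1, j2 in matcher.get_opcodes():
        if op == "equal":
            continue
        # Small edits suggest surface changes; larger replacements or
        # insertions suggest substantive, higher-order revision.
        if max(i2 - i1, j2 - j1) > 20:
            higher_order += 1
        else:
            surface += 1

    print(f"surface edits: {surface}, higher-order edits: {higher_order}")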
