Abstract

Studies of educational games often treat them as “black boxes” (Black and Wiliam in Phi Delta Kappan 80:139–148, 1998; Buckley et al. in Int J Learn Technol 5:166–190, 2010; Buckley et al. in J Sci Educ Technol 13:23–41, 2010) and measure their effectiveness by exposing a treatment group of students to the game and comparing their performance on an external assessment to that of a control group taught the same material by some other method. This precludes the possibility of monitoring, evaluating, and reacting to the actions of individual students as they progress through the game. To do that, however, one must know what to look for, because superficial measures of success are unlikely to identify unproductive behaviors such as “gaming the system” (Baker in Philipp Comput J, 2011; Downs et al. in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, USA, 2010). The research reported here advances the ultimate goal of creating educational games that can provide real-time, meaningful feedback on the progress of their users, enabling teachers or the game itself to intervene in a timely manner. We present the results of an in-depth analysis of students’ actions in Geniventure, an interactive digital game designed to teach genetics to middle and high school students. Geniventure offers a sequence of challenges of increasing difficulty and records students’ actions as they progress. We analyzed the resulting log files, taking into account not only whether a student achieved a certain goal, but also the quality of the student’s performance on each attempt. Using this information, we quantified students’ performance and correlated it with their learning gain as estimated by scores on identical multiple-choice tests administered before and after exposure to Geniventure. This analysis was performed in classes taught by teachers who had participated in professional development as part of a research project. A two-tailed paired-sample t-test of mean pre-test and post-test scores in these classes indicates a significant positive difference with a large effect size. Multivariate regression analysis of log data finds no correlation between students’ post-test scores and their performance on “practice” challenges that invite experimentation, but a highly significant positive correlation with performance on “assessment” challenges, presented immediately following the practice challenges, which required students to invoke relevant mental models. We repeated this analysis with similar results using a second group of classes led by teachers who implemented Geniventure on their own after the conclusion of, and with no support from, the research project.
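As a rough illustration of the kind of analysis summarized above, the sketch below shows how a two-tailed paired-sample t-test of pre- and post-test scores and a multivariate regression of post-test scores on log-derived performance measures might be carried out in Python. The data, column names, and model specification here are hypothetical stand-ins, not the study's actual dataset or pipeline.

```python
# Hypothetical sketch: paired-sample t-test on pre/post scores and a
# regression of post-test scores on performance measures derived from
# game log files. All data and column names are illustrative only.
import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 120  # hypothetical number of students

# Synthetic stand-in for the real data: one row per student, with pre/post
# test scores and performance measures extracted from challenge logs.
df = pd.DataFrame({
    "pre_score": rng.normal(10, 3, n),
    "practice_perf": rng.uniform(0, 1, n),    # score on "practice" challenges
    "assessment_perf": rng.uniform(0, 1, n),  # score on "assessment" challenges
})
df["post_score"] = df["pre_score"] + 3 + 4 * df["assessment_perf"] + rng.normal(0, 2, n)

# Two-tailed paired-sample t-test of post- vs. pre-test scores,
# with Cohen's d on the paired differences as an effect size.
t_stat, p_val = stats.ttest_rel(df["post_score"], df["pre_score"])
diff = df["post_score"] - df["pre_score"]
cohens_d = diff.mean() / diff.std(ddof=1)
print(f"paired t = {t_stat:.2f}, p = {p_val:.4f}, Cohen's d = {cohens_d:.2f}")

# Multiple regression: do practice vs. assessment challenge performance
# predict post-test scores, controlling for pre-test score?
X = sm.add_constant(df[["pre_score", "practice_perf", "assessment_perf"]])
model = sm.OLS(df["post_score"], X).fit()
print(model.summary())
```

In a setting like the one described in the abstract, one would expect the coefficient on the assessment-challenge measure to be significant while the practice-challenge measure is not, though that outcome of course depends on the real data rather than this synthetic example.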
