Abstract

Video games are complex mental tasks that require mastery of numerous individual and interconnected skills that map onto the component mechanics of a game. Basic skills and mechanics typically build and depend on each other in a nested learning hierarchy, which game designers have modeled as skill chains of skill atoms. For players to optimally learn and enjoy a game, it should introduce skill atoms in the ideal sequence of this hierarchy or chain. Though players learn new skills from any game, instructional games explicitly attempt to guide players through the acquisition of skills via the careful organization of levels and content. However, game designers typically construct and use hypothetical skill chains based solely on design intent, theory, or personal observation rather than empirical observation. Additionally, instructional games have begun to incorporate procedural content generation (PCG) into their design, since it can offer several advantages over human-authored content, including personalized content, the inability to share answers between students, and endless practice. Existing PCG methods for analyzing content typically focus on physical properties of the content in a synchronic, point-in-time manner rather than taking a diachronic view of experience as players acquire new skills over time. A lack of rigorous PCG evaluation techniques makes understanding the effect of PCG on players challenging. Currently, no quantitative methods exist to analyze content based on the skills players should learn during gameplay without subjecting players to burdensome playtesting to gather data. Even when data is available, these methods struggle to adapt during the design and development of a game, as changes may fundamentally alter the game's mechanics and required strategies.
In this thesis, we address three critical aspects of PCG in educational games: the effect of PCG on players' gameplay behavior, the formalization of skill chains, and automated, skill-based methods for analyzing content to create level orderings. To evaluate the effect of PCG on players, we develop GrACE, a computer science educational game for middle-school students that incorporates PCG as a key mechanic. Additionally, we implement a mixed-methods approach to analyzing player behavior, which sheds light on the limits of current PCG metrics that failed to generate an adequate level progression. Next, we address the challenge of formalizing the creation of skill chains through an adapted cognitive task analysis method that incorporates player feedback; we include a critical reflection on our method to determine its strengths and weaknesses. Finally, we address the lack of methods for skill-based automated playtesting of content by developing StrataBots, which encapsulate the suite of player-acquired skills. We initially develop StrataBots for GrACE to improve the game's level progression, but we also demonstrate the generalizability of StrataBots with Monte Carlo Tree Search by applying this technique to the analysis of Foldit, a human computation game. This allows us to evaluate the difficulty curves of games with a more nuanced view of difficulty based on player understanding and skills.