Abstract

We regularly encounter complex activities composed of basic skills, both conscious and subconscious. Performing these complex activities adequately requires mastering the individual basic skills and being able to integrate them seamlessly. Games are one example of a complex activity that is difficult to break down into its required basic skills, yet engagement in games relies on designers introducing challenges proportionate to a player's skill. Procedurally generated levels pose an additional problem, since it is hard to estimate a level's difficulty for a particular player. This proposal presents a framework for determining the skills necessary to successfully complete a game, creating AI-based bots with those skills to reflect players with the same skills, and identifying and generating optimal orderings of levels that promote learning each skill of a game. The proposed framework will be implemented in three citizen science games (Paradox, Foldit, and Nanocrafter) and one computer science educational game, GrACE.
