Abstract

In this paper, we apply the idea of testing games by learning interactions that provoke unwanted game behavior to the competition entries for several scenarios of the 2010 StarCraft AI competition. By extending the previously published macro action concept to macro action sequences for individual game units, by adapting the concept to the real-time requirements of StarCraft, and by using macros that exploit unit-specific abilities, our testing system was able to find either weaknesses or system crashes in every competition entry for the chosen scenarios. Additionally, by requiring a minimal margin with respect to surviving units, we were able to clearly identify the weaknesses of the tested AIs.
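To make the central notions concrete, the following minimal sketch (not the authors' implementation) illustrates how per-unit macro action sequences and the minimal-margin scoring described above could be represented; the names MacroAction, UnitScript, random_candidate, and fitness are hypothetical illustrations only, assuming a search- or learning-based tester that evaluates candidate interactions against a scenario.

    # Hedged sketch of per-unit macro action sequences for AI testing.
    # All identifiers below are illustrative assumptions, not the paper's code.
    import random
    from dataclasses import dataclass, field

    @dataclass
    class MacroAction:
        """One high-level command for a single unit, e.g. move, attack,
        or a unit-specific ability."""
        name: str
        target: tuple  # e.g. a map position or an enemy unit id

    @dataclass
    class UnitScript:
        """A macro action sequence assigned to one individual game unit."""
        unit_id: int
        actions: list = field(default_factory=list)

    def random_candidate(unit_ids, action_pool, max_len=5):
        """Sample one candidate interaction: a macro action sequence per unit."""
        return [UnitScript(uid, random.choices(action_pool,
                                                k=random.randint(1, max_len)))
                for uid in unit_ids]

    def fitness(surviving_own, surviving_enemy, margin=2):
        """Score a test run: the tested AI counts as beaten only if the tester
        keeps at least `margin` more units alive (the 'minimal margin')."""
        return (surviving_own - surviving_enemy) - margin

A candidate produced by random_candidate would be executed in a scenario and scored with fitness; a positive score under the margin requirement marks a clear weakness rather than a narrow win.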
