Abstract

StarCraft is one of the most successful real-time strategy (RTS) games and is actively studied by the artificial intelligence (AI) research community. Since 2010, game AI researchers have hosted annual competitions aimed at developing human-level RTS AIs for StarCraft. These competitions rank AI bots by their win ratios over thousands of AI vs. AI matches, with no human involvement. It is therefore questionable whether bots that succeed in these competitions are also competitive against, and enjoyable for, human players. In this study, we invited 20 experienced players with varying levels of expertise to evaluate the skill level, overall performance, and human-likeness of AI bots. The results show that the human players' rankings of the bots are not identical to the current rankings from the AI competitions, suggesting the need for new AI competitions that take human factors (e.g., human-likeness and adaptation) into account. The results also reveal that the expertise level of human players has a strong influence on their evaluations of the bots' overall performance and human-likeness, which supports the concept of dynamically adjusting AI bots to satisfy players of different skill levels. The outcomes of this study will also be useful for incorporating human factors into other active video game AI competitions (e.g., Angry Birds, Fighting Game, and General Game Playing).
