Abstract

This systematic review investigates evaluation tools for digital educational games and answers the question of which tools can be used to evaluate such games. A systematic search of the scientific databases EMBASE, Web of Science, Scopus, and PubMed, using related keywords in the titles, abstracts, and keywords of studies, was conducted on November 2, 2021, without a time limit. A single checklist was used to extract data such as the reference, first author's name, year of publication, tool name, type of tool, instructional strategy, and evaluation factors. A total of 3516 articles were retrieved, and analysis of the included studies ultimately yielded 22 different approaches to the systematic evaluation of educational games. Some studies developed proprietary tools exclusively for game evaluation; however, although some tools evaluated games across multiple dimensions, most did not consider the tool's validity. Five prominent evaluation guidelines stand out: E-GESS, MEEGA+, EGameFlow, HEP, and the Kato evaluation guideline, all of which were developed by explicitly decomposing the evaluation objectives into criteria and by using a questionnaire assessed through a collection of case studies. Our systematic review showed the need to identify more consistent and uniform patterns across dimensions for the systematic evaluation of digital educational games, so that valid results can serve as a basis for deciding whether to use such games.
