Background: No systematic evaluation of smartphone/mobile apps for resuscitation training and real incident support is available to date. To provide medical, usability, and additional quality criteria for the development of such apps, we conducted a mixed-methods sequential evaluation combining the perspectives of medical experts and end users.

Objective: The study aims to assess the quality of current mobile apps for cardiopulmonary resuscitation (CPR) training and real incident support from both the expert and the end-user perspective.

Methods: Two independent medical experts evaluated the medical content of CPR apps from the Google Play Store and the Apple App Store. The evaluation was based on predefined minimum medical content requirements according to current Basic Life Support (BLS) guidelines. In a second phase, nonmedical end users tested the usability and appeal of the apps that had at least met the minimum requirements. Usability was assessed with the System Usability Scale (SUS); appeal was measured with the self-developed ReactionDeck toolkit.

Results: Out of 61 apps, 46 were included in the expert evaluation, which yielded a consolidated list of 13 apps for the subsequent layperson evaluation. Interrater reliability among the experts was substantial (kappa=.61). Layperson end users (n=14) showed high interrater reliability (intraclass correlation 1 [ICC1]=.83, P<.001, 95% CI 0.750-0.882; ICC2=.79, P<.001, 95% CI 0.695-0.869). Their evaluation resulted in a list of 5 recommendable apps.

Conclusions: Although several apps for resuscitation training and real incident support are available, very few are designed according to current BLS guidelines and offer an acceptable level of usability and hedonic quality for laypersons. The results of this study are intended to optimize the development of CPR mobile apps. The app ranking supports the informed selection of mobile apps for training situations and CPR campaigns as well as for real incident support.