Abstract

With more than 300 000 health-related mobile device applications (apps) on the market serving millions of individuals worldwide, some have now reached the sought-after status of so-called unicorns. The health app landscape is rapidly growing, and so are its promises, not only for public health but also for investors' pockets. Calm, the world's first mental health unicorn, is valued at US$1 billion and now serves more than 1 million people. Health apps, like Calm, could be useful for treating or managing mental health disorders, which are often subject to stigmatisation that inhibits patients from seeking face-to-face care. In low-income regions, where smartphone use is on the rise and access to health care remains scarce, apps such as Matibabu, a non-invasive malaria diagnostic test, have the potential to transform health care by reducing the burden on overstretched doctors.

However, there is a huge disparity between the number of apps with evidence of clinical efficacy, in the form of randomised clinical trial data, and the number of apps currently available to the general population. In npj Digital Medicine, Mark Erik Larsen and colleagues reported that only one of the 73 top-ranked mental health apps on Google Play and iTunes included a citation to published literature, indicating that high-quality clinical evidence is rarely described in available mental health apps. Furthermore, the BMJ recently published a study reporting that health apps might be routinely sharing data, which could compromise the privacy of app users. The Washington Post has also reported on privacy risks associated with the family planning app Ovia, which might have shared personal data in aggregate form with users' employers and health insurers. The Wall Street Journal likewise reported that Facebook had access to sensitive data from 11 of the 70 most popular iOS apps, including the ovulation tracker Flo Health and the biometric tracker Azumio, which could leave individuals exposed to security risks. These examples show that health apps are not held to the same efficacy or privacy standards as other medical devices.

In this issue of The Lancet Digital Health, we publish three Comments that call upon app developers, patients, health-care practitioners, and governing bodies to advance the efficacy of health apps while ensuring the safety and privacy of users. Philip Henson and colleagues present a novel approach to personalising the evaluation of mental health apps based on the clinical needs of the user, helping users make an informed decision. The authors harmonised almost 1000 app evaluation questions to build a framework that helps users assess the credibility, privacy and security, evidence, ease of use, and medical utility of mental health apps. These guidelines aim to assist patients in choosing an app suited to their needs and to help health-care practitioners advise on appropriate options. Simon Leigh and Liz Ashall-Payne take an optimistic view of advances in the use of digital health apps, arguing that current infrastructure in the UK can support their successful adoption. However, they suggest that more should be done to provide accreditation of health apps by trusted health-care bodies (eg, the UK National Health Service) to secure the support of health-care professionals, which is crucial for promoting and successfully implementing health apps in routine patient care.
Agata Ferretti and colleagues compare national guidelines from nine Organisation for Economic Co-operation and Development countries, the European Commission, and WHO to identify gaps in guidance for app developers. They find that cross-border guidance is not comprehensive, making compliance difficult and accountability unclear, particularly for data sharing and privacy, and emphasise that greater collaboration between countries is necessary for better governance of health app development.

Greater clinical evidence from randomised trials and improved evaluation frameworks are necessary to empower users and health-care professionals when choosing and recommending health apps. The efficacy and safety of these apps can only be guaranteed when more coherent global guidance and governance are in place to support app developers, patients, and health-care practitioners. Governing bodies must act now to support the widespread adoption of much-needed digital interventions that serve the public and relieve the burden on health-care systems.

For more on Calm app valuation and use see https://www.bloomberg.com/technology
For the npj Digital Medicine evaluation of mental health apps see https://www.nature.com/articles/s41746-019-0093-1#Sec10
For the BMJ study of data sharing practices of medicines-related apps see https://www.bmj.com/content/364/bmj.l920
For more on privacy risks associated with health apps see https://www.washingtonpost.com/technology/2019/04/10/tracking-your-pregnancy-an-app-may-be-more-public-than-you-think/?utm_term=.26e5d9efea93
For more on apps sharing data with Facebook see https://www.wsj.com/articles/you-give-apps-sensitive-personal-information-then-they-tell-facebook-11550851636?mod=e2tw
