Abstract

Introduction

For people with heart failure (HF), self-management (e.g., adherence to prescribed medication, management of fluid restrictions and daily weighing) and dietary management are critical for disease control. The rise of technologies for healthcare use (mobile phones, wearable cameras) offers potential support for people to better manage their disease. We aimed to test the feasibility and utility of wearable cameras for identifying self-management practices and to determine whether these images can be used to enhance self-management in people with HF.

Methods

Participants wore the Narrative Clip, a small wearable camera, for one month during waking hours; still images were taken every 30 seconds. Using state-of-the-art artificial intelligence techniques, we investigated automated image analysis of daily life activities to determine the potential of these systems to identify four categories of human activity: medication management, dietary intake, meal preparation and physical activity. Participants also completed a semi-structured questionnaire about acceptability and feasibility.

Results

Thirty participants (mean age 73.6 years, 60% male) with HF NYHA Class II-III were recruited. A total of 629,603 images were available for analysis. Higher-order analyses were conducted to determine the precision of identifying correct images for the pre-defined self-management categories. Precision was highest for dietary intake (average 49%, range 13–94%), followed by meal preparation (average 40%, range 1–99%) and physical activity (average 31%, range 0–95%). Medication management had the lowest precision (average 6%, range 6–22%). Manual review of images revealed substantial periods of sedentary time, typically paired with screen time (watching television, playing cards on a computer). All participants agreed the camera was easy to use and felt they had privacy when using it. The majority of participants felt comfortable wearing the camera (93%) and thought this technique would help people with HF in the future (93%).

Conclusions

Images from wearable cameras provided rich contextual data to better understand the lived experiences of people with HF, and the device was acceptable to participants. Automated machine learning tools require more annotated data for training to improve precision, which will be achieved via further annotation, fine-tuning and retraining of the data analysis model. Despite these challenges, the data collected can be used as an adjunct to traditional data collection methods such as self-report. Once data analysis techniques are refined, objective data from wearable cameras may also prove useful for nurses providing tailored self-management education.

Acknowledgement/Funding

Heart Foundation Vanguard Grant, Australia (101348)
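The per-category precision the Results report (correctly identified images divided by all images the model assigned to that category) can be sketched as follows. This is a minimal illustration, not the study's analysis code; the category names and image counts are hypothetical.

```python
# Hypothetical sketch of the per-category precision metric:
# precision = true positives / all images predicted for the category.

def precision(true_positives: int, predicted_positives: int) -> float:
    """Fraction of images assigned to a category that truly belong to it."""
    if predicted_positives == 0:
        return 0.0  # no predictions for this category
    return true_positives / predicted_positives

# Illustrative counts only (not the study's data)
predictions = {
    "dietary intake":       {"tp": 49, "predicted": 100},
    "meal preparation":     {"tp": 40, "predicted": 100},
    "physical activity":    {"tp": 31, "predicted": 100},
    "medication management": {"tp": 6, "predicted": 100},
}

for category, counts in predictions.items():
    print(f"{category}: {precision(counts['tp'], counts['predicted']):.0%}")
```

In practice each category's precision would be computed per participant against manually annotated ground-truth images and then averaged, which is why the abstract reports both an average and a wide per-participant range.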
