Abstract

Background

Insufficient physical activity is associated with increased morbidity and mortality from many non-communicable diseases, as identified by the Chief Medical Officer reports and WHO. However, many physical-activity-related health associations are based on self-report measures, which are limited by recall, comprehension, and social desirability bias. Objective measures such as accelerometers can identify episodes of physical activity, but they cannot identify the detailed type and context of the behaviour. First-person point-of-view images can capture health-related behaviours and the context in which they occur. Historically, wearable cameras were bespoke devices with poor battery life, built within individual life-logging research groups. The SenseCam, by contrast, is a wearable camera that can record a full day's worth of behaviours and is now available to other researchers. This study investigates the feasibility of using the SenseCam wearable camera to objectively categorise the type and context of participants' accelerometer-identified episodes of activity.

Methods

A convenience sample of 52 university workers was recruited from the USA (n=37) and New Zealand (n=15). Participants were given an Actical hip-mounted accelerometer (Mini-Mitter, Respironics, Bend, OR, USA) and a SenseCam (Vicon, Oxford Metrics Group, UK), a lightweight camera worn on a lanyard around the neck. The camera has several sensors: a tri-axial accelerometer, a magnetometer, and ambient temperature, light-level, and passive infrared sensors. Images are captured about once every 20 s, triggered by changes in these sensor values. The onboard clocks of the SenseCam and Actical devices were time-synchronised. Participants engaged in free-living activities for 3 days.
Accelerometer data were cleaned, and episodes of sedentary, lifestyle-light, lifestyle-moderate, and moderate-to-vigorous physical activity (MVPA) were identified with standard algorithms. Accelerometer episodes were manually categorised by context and physical-activity-compendium code, as identified from time-matched SenseCam images.

Findings

212 days of footage from 49 participants were analysed. Type and context attributes were coded from SenseCam images for 386 randomly selected episodes, taking 63 s (95% CI 41–86) to manually code each episode. 12 categories and 114 subcategory types, aligned with the physical activity compendium, were identified. 311 (81%) episodes could be categorised; about a fifth could not, mostly because of participant compliance issues. Of the coded data, 183 (59%) episodes were outdoors versus 120 (39%) indoors; 104 (33%) were recorded as leisure-time activities, 103 (33%) transport, 57 (18%) domestic, and 47 (15%) occupational. 104 (33%) of the exemplar episodes involved direct social interaction, and 67 (22%) occurred in social situations in which the participant was not directly engaged.

Interpretation

Wearable cameras can provide data for objective categorisation of accelerometer-defined episodes of activity in free-living situations. However, these devices need further validation against other measures of behaviour, such as direct observation. Future studies should investigate state-of-the-art computer vision techniques to automatically identify behaviours from SenseCam images. National surveillance systems such as the Health Survey for England, Biobank, and the National Health and Nutrition Examination Survey could use wearable cameras in a subset of participants to develop more appropriate accelerometer classification algorithms for free-living behaviours.
Funding

British Heart Foundation (grant 021/P&C/Core/2010/HPRG), Irish Health Research Board (MCPD/2010/12), and Microsoft Research PhD Scholarship Programme.
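The time-matching step described in the Methods, linking accelerometer-identified episodes to SenseCam images via their synchronised clocks, can be sketched as follows. This is a minimal illustration only: the data structures, timestamps, and field names are hypothetical, not the study's actual processing code.

```python
from datetime import datetime, timedelta

# Hypothetical accelerometer episodes (start/end from standard algorithms).
episodes = [
    {"start": datetime(2011, 5, 2, 9, 0), "end": datetime(2011, 5, 2, 9, 12),
     "intensity": "MVPA"},
    {"start": datetime(2011, 5, 2, 13, 30), "end": datetime(2011, 5, 2, 13, 45),
     "intensity": "sedentary"},
]

# SenseCam captures roughly every 20 s; a fixed 20 s interval is assumed here
# for illustration (in practice capture is sensor-triggered).
images = [datetime(2011, 5, 2, 9, 0, 10) + timedelta(seconds=20 * i)
          for i in range(40)]

def images_for_episode(episode, images):
    """Return the image timestamps that fall within an episode's time window,
    assuming both device clocks have been synchronised."""
    return [t for t in images if episode["start"] <= t <= episode["end"]]

# Images matched to each episode would then be shown to a human coder,
# who assigns a type and context category.
for e in episodes:
    matched = images_for_episode(e, images)
    print(e["intensity"], len(matched))
```

The MVPA episode here overlaps the simulated image stream and picks up matching frames, while the afternoon sedentary episode matches none, which is how unclassifiable episodes (e.g. camera not worn) would surface in practice.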
