Abstract
Automated detection of intake gestures with wearable sensors has been a critical area of research for advancing our understanding of, and ability to intervene in, people's eating behavior. Numerous algorithms have been developed and evaluated in terms of accuracy. However, for real-world deployment, a system must be not only accurate in its predictions but also efficient in making them. Despite the growing research on accurate detection of intake gestures using wearables, many of these algorithms are energy inefficient, impeding on-device deployment for continuous and real-time monitoring of diet. This paper presents a template-based optimized multicenter classifier that enables accurate intake gesture detection with low inference time and energy consumption using a wrist-worn accelerometer and gyroscope. We designed an Intake Gesture Counter smartphone application (CountING) and validated the practicality of our algorithm against seven state-of-the-art approaches on three public datasets (In-lab FIC, Clemson, and OREBA). Compared with other methods, our approach achieved the best accuracy (81.60% F1 score) and very low inference time (15.97 msec per 2.20-sec data sample) on the Clemson dataset. Among the top-performing algorithms, it achieved comparable accuracy (83.0% F1 score vs. 85.6% for the top-performing algorithm) with superior inference time (13.8x faster, at 33.14 msec per 2.20-sec data sample) on the In-lab FIC dataset, and comparable accuracy (83.40% F1 score vs. 88.10% for the top-performing algorithm) with superior inference time (33.9x faster, at 16.71 msec per 2.20-sec data sample) on the OREBA dataset. On average, our approach achieved a 25-hour battery lifetime (a 44% to 52% improvement over state-of-the-art approaches) when tested on a commercial smartwatch for continuous real-time detection. Our approach demonstrates an effective and efficient method for real-time intake gesture detection using wrist-worn devices in longitudinal studies.
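The abstract names a template-based multicenter classifier over wrist-worn accelerometer and gyroscope windows but does not describe its internals. The sketch below is only a minimal illustration of the general idea under our own assumptions: a few template centers are stored per class, and each 2.20-sec IMU window is assigned the label of its nearest center. The window length matches the 2.20-sec samples reported above; the sampling rate, feature set, class encoding, and all function and class names are hypothetical and are not taken from the paper.

```python
# Minimal sketch (not the authors' implementation): a nearest-center
# ("multicenter") classifier over fixed-length accel+gyro windows.
# All names, the sampling rate, and the feature set are illustrative assumptions.
import numpy as np

WINDOW_SEC = 2.2          # window length matching the 2.20-sec samples in the abstract
SAMPLE_RATE_HZ = 64       # assumed IMU sampling rate (not stated in the abstract)
WINDOW_LEN = int(WINDOW_SEC * SAMPLE_RATE_HZ)

def extract_features(window: np.ndarray) -> np.ndarray:
    """Reduce a (WINDOW_LEN, 6) accel+gyro window to simple per-axis statistics."""
    # Mean and standard deviation per axis: a 12-dimensional feature vector.
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

class MultiCenterClassifier:
    """Store several template centers per class; predict by the nearest center."""

    def __init__(self, centers_per_class: int = 3):
        self.centers_per_class = centers_per_class
        self.centers = []  # list of (feature_vector, label) pairs

    def fit(self, features: np.ndarray, labels: np.ndarray) -> None:
        rng = np.random.default_rng(0)
        for label in np.unique(labels):
            class_feats = features[labels == label]
            # Pick a few representative templates per class (random here;
            # clustering such as k-means would be a natural alternative).
            idx = rng.choice(len(class_feats),
                             size=min(self.centers_per_class, len(class_feats)),
                             replace=False)
            for feat in class_feats[idx]:
                self.centers.append((feat, label))

    def predict(self, features: np.ndarray) -> np.ndarray:
        center_feats = np.stack([c for c, _ in self.centers])
        center_labels = np.array([l for _, l in self.centers])
        # Distance from each window to every stored center; the closest center wins.
        dists = np.linalg.norm(features[:, None, :] - center_feats[None, :, :],
                               axis=-1)
        return center_labels[dists.argmin(axis=1)]

# Toy usage with random data standing in for windowed accelerometer+gyroscope signals.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    windows = rng.normal(size=(100, WINDOW_LEN, 6))
    labels = rng.integers(0, 2, size=100)  # 1 = intake gesture, 0 = other movement
    feats = np.stack([extract_features(w) for w in windows])
    clf = MultiCenterClassifier()
    clf.fit(feats, labels)
    print(clf.predict(feats[:5]))
```

Because inference reduces to a handful of distance computations against a small set of stored centers, a classifier of this form keeps per-window inference cost low, which is consistent with the low inference times and extended battery lifetime reported in the abstract.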