Abstract

While the task of detecting eating events has been examined in prior work using a variety of wearable devices, the use of the smartphone as a standalone device to infer eating events remains an open issue. In this paper, we propose a framework that infers eating vs. non-eating events from passive smartphone sensing, and evaluate it on a dataset of 58 college students. First, we show that time of the day and features from modalities such as screen usage, accelerometer, app usage, and location are indicative of eating and non-eating events. Then, we show that eating events can be inferred with an AUROC (area under the receiver operating characteristic curve) of 0.65 (F1-score of 0.75) using a subject-independent machine learning model, which can be further improved up to 0.81 (F1-score of 0.85 for subject-dependent and 0.81 for hybrid models) using personalization techniques. We also show that users have different behavioral and contextual routines around eating episodes, which require specific feature groups when training fully personalized models. These findings are of potential value for future context-aware mobile food diary apps by: (i) enabling scalable sensing-based eating studies using only smartphones; (ii) detecting under-reported eating events, thus increasing data quality in self-report-based studies; (iii) providing functionality to track food consumption and generate reminders for on-time collection of food diaries; and (iv) supporting mobile interventions towards healthy eating practices.
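To make the subject-independent evaluation setup concrete, the sketch below shows how eating vs. non-eating classification can be scored with AUROC and F1 under leave-one-subject-out cross-validation. It is a minimal illustration, not the authors' code: the random forest classifier, the synthetic features, and the label generation are assumptions standing in for the paper's sensing features (screen, accelerometer, app usage, location, time of day) and models.

```python
# Minimal sketch of subject-independent (leave-one-subject-out) evaluation for an
# eating vs. non-eating classifier. Data, features, and classifier are illustrative.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score, f1_score
from sklearn.model_selection import LeaveOneGroupOut

rng = np.random.default_rng(0)
n_samples, n_features = 2000, 10                      # placeholder sensing features
X = rng.normal(size=(n_samples, n_features))
y = rng.integers(0, 2, size=n_samples)                # 1 = eating event, 0 = non-eating
subjects = rng.integers(0, 58, size=n_samples)        # one group per student (58 total)

aurocs, f1s = [], []
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=subjects):
    if len(np.unique(y[test_idx])) < 2:
        continue                                      # AUROC needs both classes in the held-out subject
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X[train_idx], y[train_idx])
    proba = clf.predict_proba(X[test_idx])[:, 1]
    aurocs.append(roc_auc_score(y[test_idx], proba))
    f1s.append(f1_score(y[test_idx], (proba >= 0.5).astype(int)))

print(f"subject-independent AUROC: {np.mean(aurocs):.2f}, F1: {np.mean(f1s):.2f}")
```

A subject-dependent or hybrid (personalized) variant would instead include some of the held-out user's own data in training, which is the personalization step the abstract credits with raising AUROC from 0.65 to up to 0.81.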
