Abstract

Monitoring of eating behavior using wearable technology is receiving increased attention, driven by recent advances in wearable devices and mobile phones. One particularly interesting aspect of eating behavior is the monitoring of chewing activity and eating occurrences. Several chewing sensor types and chewing detection algorithms have been proposed in the literature; however, no datasets are publicly available to facilitate evaluation and further research. In this paper, we present a multi-modal dataset of over 60 hours of recordings from 14 participants in semi-free living conditions, collected in the context of the SPLENDID project. The dataset includes raw signals from a photoplethysmography (PPG) sensor and a 3D accelerometer, and a set of extracted features from audio recordings; detailed annotations and ground truth are also provided, both at the eating-event level and at the individual-chew level. We also provide a baseline evaluation method, and introduce the "challenge" of improving the baseline chewing detection algorithms. The dataset can be downloaded from http://dx.doi.org/10.17026/dans-zxw-v8gy, and supplementary code can be downloaded from https://github.com/mug-auth/chewing-detection-challenge.git.
