Abstract
Research interest in individuals' eating habits has grown recently, since unhealthy eating habits are strongly associated with diseases such as obesity, diabetes, and cardiovascular disease. Monitoring people's eating behavior creates opportunities to provide feedback and suggestions toward healthier eating habits. Body-worn inertial sensors have become a popular solution for eating-behavior recognition because they are convenient for subjects to wear in daily life. In this paper, a novel approach to detecting eating gestures by tracking finger motion is proposed. A dataset of 375 gestures spanning 7 activities is created for training and testing classifiers. A Teager-energy-based algorithm is designed to segment the gesture series. Seven state-of-the-art learning methods are tested for binary (eating/non-eating) and multiclass (seven-class) classification. Accelerometer and gyroscope datasets are evaluated separately and compared. The results show that, on finger-motion data, K-Nearest Neighbors (KNN) performs best, achieving 97.1% accuracy in binary classification and outperforming the same approach on a wrist-motion dataset. These results indicate that finger motion is an effective indicator for distinguishing eating from non-eating behaviors.
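The abstract does not give implementation details of the Teager-energy-based segmentation. As a rough illustration only, the discrete Teager energy operator commonly used for this purpose is ψ[n] = x[n]² − x[n−1]·x[n+1], and a minimal sketch of thresholding it to find gesture segments (the threshold and segmentation logic here are assumptions, not the paper's algorithm) might look like:

```python
# Sketch of Teager-energy-based segmentation. The operator is standard;
# the thresholding scheme below is an assumed, simplified stand-in for
# the paper's (unspecified) segmentation algorithm.

def teager_energy(x):
    """Discrete Teager energy: psi[n] = x[n]^2 - x[n-1] * x[n+1]."""
    return [x[n] ** 2 - x[n - 1] * x[n + 1] for n in range(1, len(x) - 1)]

def segment(signal, threshold):
    """Return (start, end) index pairs (into the energy sequence)
    where Teager energy exceeds the threshold."""
    energy = teager_energy(signal)
    segments, start = [], None
    for i, e in enumerate(energy):
        if e > threshold and start is None:
            start = i                      # segment begins
        elif e <= threshold and start is not None:
            segments.append((start, i))    # segment ends
            start = None
    if start is not None:                  # signal ended inside a segment
        segments.append((start, len(energy)))
    return segments
```

An impulse-like burst in the signal yields a spike in Teager energy, so high-energy regions align with candidate gesture boundaries.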