Abstract
An unhealthy dietary structure contributes to the prevalence of chronic diseases such as obesity, diabetes, and heart disease. Automatic food type recognition helps nutritionists and medical professionals understand the nutritional content of patients’ diets, provide accurate and personalized treatments, and evaluate therapeutic effects. Existing wearable sensor-based methods take advantage of microphones, electromyography (EMG) sensors, and piezoelectric sensors embedded in wearable devices. However, these sensors are either easily affected by ambient acoustic noise or intrusive and uncomfortable to wear. We observe that each type of food has its own intrinsic properties, such as hardness, elasticity, fracturability, adhesiveness, and size, and that different food properties result in different mastication dynamics. In this paper, we present the first effort to use wearable motion sensors to sense mastication dynamics and infer food types accordingly. We define six mastication dynamics parameters to represent these food properties: chewing speed, the number of chews, chewing time, chewing force, chewing cycle duration, and skull vibration. We embed motion sensors in a headband and position them over the temporalis muscles to sense mastication dynamics accurately and less intrusively. In addition, we extract 65 hand-crafted features from each chewing sequence to explicitly characterize the mastication dynamics captured in the motion sensor data. A real-world evaluation dataset covering 11 food categories (20 types of food in total) is collected from 15 human subjects. The average recognition accuracy across these 15 subjects is 82.3%, and the accuracy for a single subject reaches up to 93.3%.
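To make the parameter definitions concrete, the sketch below derives a few of the six mastication dynamics parameters (chewing time, number of chews, chewing speed, chewing cycle duration, and a force proxy) from a one-dimensional motion-sensor chewing sequence. This is an illustrative subset only: the abstract does not enumerate the paper's 65 hand-crafted features, so the zero-crossing chew detector and the statistics chosen here are assumptions, not the authors' actual feature set.

```python
import math

def chew_features(signal, fs=100.0):
    """Illustrative mastication-dynamics features from a 1-D
    motion-sensor chewing sequence (e.g. accelerometer magnitude).

    Hypothetical sketch: the paper's 65 hand-crafted features are not
    enumerated in the abstract, so these statistics stand in for them.
    """
    n = len(signal)
    duration = n / fs                                  # chewing time (s)
    mean = sum(signal) / n
    centered = [x - mean for x in signal]
    # Count chews as upward zero crossings of the mean-centered signal,
    # a crude proxy for chewing cycles.
    crossings = [i for i in range(n - 1)
                 if centered[i] < 0 <= centered[i + 1]]
    num_chews = len(crossings)
    chew_rate = num_chews / duration                   # chewing speed (chews/s)
    if num_chews > 1:
        # Average spacing between detected chews, in seconds.
        cycle = (crossings[-1] - crossings[0]) / (num_chews - 1) / fs
    else:
        cycle = 0.0
    force_proxy = max(abs(x) for x in centered)        # peak amplitude ~ chewing force
    return {"chewing_time_s": duration,
            "num_chews": num_chews,
            "chew_rate_hz": chew_rate,
            "cycle_duration_s": cycle,
            "force_proxy": force_proxy}

# Example: a synthetic 1.5 Hz "chewing" signal, 10 s sampled at 100 Hz.
fs = 100.0
sig = [math.sin(2 * math.pi * 1.5 * k / fs) for k in range(1000)]
feats = chew_features(sig, fs)
```

In a real pipeline these per-sequence parameters would be concatenated with further time- and frequency-domain statistics before classification.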