Abstract
Diabetes is heavily affected by the patient’s lifestyle, and it affects that lifestyle in turn. Most diabetic patients can manage the disease without technological assistance, so we should not burden them with technology unnecessarily, but lifestyle-monitoring technology can still benefit both patients and their physicians. We therefore developed an approach to lifestyle monitoring that uses the smartphone, which most patients already have. The approach consists of three steps. First, a number of features are extracted from the data acquired by smartphone sensors, such as the user’s location from GPS coordinates and visible Wi-Fi access points, and physical activity from accelerometer data. Second, several classifiers trained by machine learning recognise the user’s activity, such as work, exercise or eating. Third, these activities are refined by symbolic reasoning encoded in Event Calculus. The approach was trained and tested on five people who each recorded their activities for two weeks. Its classification accuracy was 0.88.
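To make the first step concrete, the sketch below shows how features might be extracted from one window of accelerometer samples. It is a minimal illustration in Python, assuming tri-axial samples and a simple magnitude-based feature set; the window length, feature names and function are ours for illustration, not taken from the paper.

    import numpy as np

    def accel_features(window: np.ndarray) -> dict:
        """Summarise one window of accelerometer samples (shape: n_samples x 3)."""
        magnitude = np.linalg.norm(window, axis=1)       # per-sample acceleration magnitude
        return {
            "mag_mean": magnitude.mean(),                # overall movement intensity
            "mag_std": magnitude.std(),                  # movement variability
            "mag_max": magnitude.max(),                  # peak acceleration
            "mag_energy": float((magnitude ** 2).mean()),  # mean signal energy
        }

    # Example: a 2-second window sampled at 50 Hz (100 samples, x/y/z columns).
    window = np.random.default_rng(0).normal(0.0, 1.0, size=(100, 3))
    print(accel_features(window))

Analogous features (for example, clusters of GPS coordinates or sets of visible Wi-Fi access points for location) would be computed for the other sensors and concatenated into one feature vector per time window.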
Highlights
According to the International Diabetes Federation, 5.6 % of the global population suffer from diabetes, and this figure is increasing [1]
In this paper we present an approach that combines machine learning and symbolic reasoning to recognise lifestyle activities of diabetic patients using sensor data obtained primarily from the patients’ smartphone
Machine learning was used to deal with the large quantity of difficult-to-interpret sensor data
Summary
According to the International Diabetes Federation, 5.6 % of the global population suffer from diabetes, and this figure is increasing [1]. Key activities for diabetic patients are eating and exercise – the former because it puts glucose in the blood and the latter because it speeds up its absorption. Patients therefore have to monitor and manage these activities very carefully. In this paper we describe an approach to recognise basic lifestyle activities with the sensors built into the smartphone. The key features extracted from sensor data are the user’s location, physical activity and ambient sound. These are fed into a number of classifiers, trained with machine learning, that output the user’s activity.
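As a rough illustration of the classification step, the following Python sketch trains a Random Forest on per-window feature vectors and estimates its accuracy with cross-validation. The synthetic data, the label set and the choice of Random Forest are our assumptions; the paper states only that several machine-learned classifiers were used.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 6))                             # placeholder feature vectors
    y = rng.choice(["work", "exercise", "eating"], size=500)  # placeholder activity labels

    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    scores = cross_val_score(clf, X, y, cv=5)                 # 5-fold cross-validation
    print(f"mean accuracy: {scores.mean():.2f}")

On this random placeholder data the printed accuracy is near chance; the 0.88 reported above comes from the authors’ real feature vectors and full pipeline.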