Abstract

State-of-the-art in-home activity recognition schemes with wearable devices are mostly capable of detecting coarse-grained activities (sitting, standing, walking, or lying down), but cannot distinguish complex activities (sitting on the floor versus on the sofa or bed). Such schemes are often ineffective for emerging critical healthcare applications -- for example, remote monitoring of patients with Alzheimer's disease, bulimia, or anorexia -- because those applications require a more comprehensive, contextual, and fine-grained recognition of complex daily user activities. Here, a novel approach for in-home, fine-grained activity recognition uses multimodal wearable sensors on multiple body positions, along with lightly deployed Bluetooth beacons in the environment. In particular, this solution measures the user's ambient environment and location context with wearable sensing and Bluetooth beacons, along with user movement captured by accelerometer and gyroscope sensors. The proposed algorithm is a two-level supervised classifier with both levels running on a server. In the first level, multisensor data from the wearables on each body position are collected and analyzed using the proposed modified conditional random field (CRF)-based supervised activity classifier. The classified activity states from the individual wearables are then fused to decide the user's final activity state. Preliminary experimental results are presented on the classification of 19 complex daily activities of a user at home.
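The second-level fusion step described above can be sketched as a weighted vote over the per-position classifications. The abstract does not specify the fusion rule, so the weighted majority vote below is an illustrative assumption; the per-position labels stand in for the output of the first-level CRF classifiers, and the body-position names and weights are hypothetical.

```python
from collections import Counter

def fuse_activity_states(per_position_labels, weights=None):
    """Level-2 fusion: weighted majority vote over per-position labels.

    per_position_labels: dict mapping body position -> predicted activity
                         (assumed output of the level-1 CRF classifiers).
    weights: optional dict mapping body position -> vote weight, e.g.
             reflecting each position's classifier accuracy.
    """
    votes = Counter()
    for position, label in per_position_labels.items():
        w = weights.get(position, 1.0) if weights else 1.0
        votes[label] += w
    # Return the activity with the highest total vote weight.
    return votes.most_common(1)[0][0]

# Example: wrist and thigh sensors agree; the ankle sensor is outvoted.
predictions = {"wrist": "sitting_on_sofa",
               "thigh": "sitting_on_sofa",
               "ankle": "sitting_on_floor"}
print(fuse_activity_states(predictions))  # -> sitting_on_sofa
```

A weighted vote lets more reliable body positions (for instance, a thigh sensor for sitting postures) dominate ambiguous cases without retraining the per-position classifiers.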
