Abstract

Human activity recognition (HAR) and monitoring are beneficial for many medical applications, such as eldercare and post-trauma rehabilitation after surgery. HAR models based on smartphone accelerometer data could provide a convenient and ubiquitous solution to this problem. However, such models are mostly concerned with identifying basic activities such as ‘stand’ or ‘walk’, so higher-level contexts such as ‘walk in a queue’, in which a specific set of activities is performed, remain unnoticed. In this paper, we therefore design a HAR framework that identifies a group of activities (rather than a single basic activity) performed within a time window, which enables us to extract more meaningful information about the subject’s overall context. An algorithm is designed that formulates HAR as a multi-instance multi-label (MIML) learning problem, and a procedure is given for generating feature bags of consecutive activity traces associated with multiple labels. The temporal relationship among activities is exploited to obtain a more comprehensive HAR model. Interestingly, the framework can completely or partially identify activity sequences that are not even present in the training dataset. The framework is implemented and found to work adequately when tested on a real dataset collected from 8 users performing 12 different activity combinations. MIML-kNN provides the highest average precision (around 90%), even on an unseen test dataset.
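As a rough illustration of the bag-generation step described in the abstract, the sketch below groups consecutive windowed accelerometer features into MIML bags whose label sets are the activities observed within each bag. The window length, bag size, feature choices, and function names are assumptions made for illustration and are not taken from the paper.

```python
import numpy as np

# Illustrative sketch of MIML bag generation from tri-axial accelerometer
# traces. Window lengths, features, and names are assumptions, not the
# paper's exact procedure.

def window_features(samples):
    """Mean and standard deviation per axis for one instance window."""
    return np.concatenate([samples.mean(axis=0), samples.std(axis=0)])

def make_bags(accel, labels, win=128, windows_per_bag=10):
    """Group consecutive feature windows into bags labelled with the
    set of activities observed inside each bag.

    accel  : (N, 3) array of x/y/z accelerometer samples
    labels : length-N array of per-sample activity labels
    Returns a list of (bag, label_set) pairs, where a bag is a
    (windows_per_bag, n_features) array of instances.
    """
    bags = []
    bag_len = win * windows_per_bag
    for start in range(0, len(accel) - bag_len + 1, bag_len):
        instances, bag_labels = [], set()
        for w in range(windows_per_bag):
            lo = start + w * win
            hi = lo + win
            instances.append(window_features(accel[lo:hi]))
            bag_labels.update(labels[lo:hi])   # multi-label bag
        bags.append((np.vstack(instances), bag_labels))
    return bags

# Example with synthetic data: 80 s of 50 Hz samples covering two activities.
accel = np.random.randn(4000, 3)
labels = np.array(['walk'] * 1800 + ['stand'] * 2200)
bags = make_bags(accel, labels, win=100, windows_per_bag=5)
print(len(bags), bags[0][0].shape, bags[3][1])   # 8 bags; bag 3 spans both activities
```

In this formulation each bag is one training example for a MIML learner such as MIML-kNN: its instances are per-window feature vectors, and its label set is the group of activities performed during that time window.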
