Abstract

Human activity recognition is an active research area in pervasive computing and mobile health applications. Many human activity recognition systems based on inertial measurement unit (IMU) sensor data have been proposed in recent years. These systems typically collect data from IMU sensors placed on the torso and limbs and apply supervised machine learning algorithms to the sensor data. One major issue with these systems is that wearing multiple on-body IMU sensors is inconvenient in users' daily lives. Another issue with these existing methods is that an activity recognition model trained on a specific subject does not work well when applied to another subject's activities, because IMU activity data always carry information specific to the person performing the activities. In our work, inspired by the principle of domain adaptation, we propose a new deep-learning activity recognition model based on an adversarial network, which removes the subject-specific information within the IMU activity data and extracts subject-independent features shared by data collected from different subjects. We also, for the first time, use data collected from insole-based IMU sensors on 8 participants performing 5 common activities to build a new real-world human activity dataset that minimizes the inconvenience of wearing sensors. We conducted experiments with this new real-world dataset. Results show that our subject-independent activity recognition model outperforms state-of-the-art supervised learning techniques and successfully eliminates the effects of individual differences between subjects. The average recognition accuracy under the leave-one-out (L1O) condition reaches 99.0%, which is higher than the performance of a traditional CNN-based human activity recognition system.
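As a rough illustration of the adversarial idea described above, the standard building block for removing domain-specific (here, subject-specific) information is a gradient-reversal layer placed between the feature extractor and a subject discriminator: features pass through unchanged in the forward pass, but the discriminator's gradient is negated on the way back, pushing the extractor toward subject-independent features. The abstract does not specify the paper's exact architecture, so this is a minimal hedged sketch; the class name and the scaling factor `lam` are illustrative assumptions.

```python
# Minimal sketch of a gradient-reversal layer (Ganin & Lempitsky style),
# a common mechanism for adversarial subject/domain-invariant learning.
# This is NOT the paper's implementation; names and parameters are assumed.

class GradientReversal:
    """Identity in the forward pass; scales gradients by -lam backward.

    Inserted between the feature extractor and the subject discriminator,
    it makes the extractor maximize (rather than minimize) the
    discriminator's loss, erasing subject-specific information.
    """

    def __init__(self, lam=1.0):
        self.lam = lam  # trade-off weight for the adversarial signal

    def forward(self, features):
        # Features reach the subject discriminator unchanged.
        return features

    def backward(self, grad_output):
        # Gradient flowing back to the feature extractor is reversed.
        return [-self.lam * g for g in grad_output]


layer = GradientReversal(lam=0.5)
feats = [1.0, 2.0, 3.0]
print(layer.forward(feats))        # features unchanged: [1.0, 2.0, 3.0]
print(layer.backward([1.0, 1.0]))  # reversed gradient: [-0.5, -0.5]
```

In a full training loop, the activity classifier is trained normally on these features, while the subject discriminator's reversed gradient drives the shared extractor toward features from which the subject cannot be identified.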
