Abstract

Human Activity Recognition (HAR) from devices such as smartphone accelerometers is a fundamental problem in ubiquitous computing. Machine-learning-based recognition models often perform poorly when applied to new users who were not part of the training data. Previous work has addressed this challenge by personalizing general recognition models to the unique motion pattern of a new user in a static batch setting, which requires target user data to be available upfront. The more challenging online setting, in which no samples from the target user are available in advance but instead arrive sequentially, has received less attention. Additionally, the motion pattern of users may change over time, so adapting to new information must be traded off against forgetting old information. Finally, the target user should not have to do any work, such as labeling activities, to use the recognition system. Our work addresses all of these challenges with an unsupervised online domain adaptation algorithm in which both classification and personalization happen continuously and incrementally in real time. Our solution aligns the feature distributions of all subjects, whether sources or the target, in hidden neural network layers. To this end, we normalize the input of a layer with user-specific mean and variance statistics. During training, these statistics are computed over user-specific batches; in the online phase, they are estimated incrementally for any new target user.
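To make the mechanism concrete, below is a minimal sketch (not the authors' implementation) of such a per-user normalization layer in PyTorch. The class name `UserNorm`, the `update_online` method, and the exponential-moving-average rule for the incremental online estimates are illustrative assumptions; the abstract specifies only that per-user statistics are computed over user-specific batches during training and estimated incrementally online.

```python
import torch
import torch.nn as nn


class UserNorm(nn.Module):
    """Normalizes layer inputs with user-specific mean/variance statistics.

    During training, each batch is assumed to contain samples from a single
    user, so batch statistics are user-specific by construction. In the
    online phase, statistics for a new target user are estimated
    incrementally (here via an exponential moving average, an assumption)
    from unlabeled samples as they arrive.
    """

    def __init__(self, num_features: int, momentum: float = 0.1, eps: float = 1e-5):
        super().__init__()
        self.momentum = momentum
        self.eps = eps
        # Learned affine parameters, shared across all users.
        self.gamma = nn.Parameter(torch.ones(num_features))
        self.beta = nn.Parameter(torch.zeros(num_features))
        # Running statistics; at deployment these hold the estimates
        # for the current target user.
        self.register_buffer("running_mean", torch.zeros(num_features))
        self.register_buffer("running_var", torch.ones(num_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.training:
            # Training: normalize with the statistics of the current
            # (single-user) batch.
            mean = x.mean(dim=0)
            var = x.var(dim=0, unbiased=False)
        else:
            # Online phase: normalize with the incrementally estimated
            # target-user statistics.
            mean, var = self.running_mean, self.running_var
        x_hat = (x - mean) / torch.sqrt(var + self.eps)
        return self.gamma * x_hat + self.beta

    @torch.no_grad()
    def update_online(self, x: torch.Tensor) -> None:
        """Incrementally refresh target-user statistics from new, unlabeled
        samples. The decaying average trades off adapting to new motion
        patterns against forgetting old ones."""
        m = self.momentum
        self.running_mean.mul_(1 - m).add_(m * x.mean(dim=0))
        self.running_var.mul_(1 - m).add_(m * x.var(dim=0, unbiased=False))


if __name__ == "__main__":
    layer = UserNorm(num_features=64)
    layer.eval()                  # deployment / online phase
    window = torch.randn(32, 64)  # a window of unlabeled target-user features
    layer.update_online(window)   # personalization step: no labels, no gradients
    out = layer(window)           # normalized features for classification
    print(out.shape)              # torch.Size([32, 64])
```

Note that in this sketch only the normalization statistics change at deployment, so personalization requires neither labels nor gradient updates, consistent with the requirement that the target user do no work.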
