Abstract

Smartphones and wearable devices are at the forefront of Human Activity Recognition (HAR). There have been numerous attempts to use the motion sensors in smartphones and wearables to recognize human activity. Most of these studies apply supervised learning techniques, which require labeled datasets. In this work, we take a sample of these labels, or action primitives (sit, stand, run, walk, jump, lie down), and evaluate them against the labels produced by several clustering algorithms. We built two datasets (labeled and unlabeled) using accelerometer, gyroscope, and pedometer readings from two fixed-position devices: a smartphone in the side pocket and a smartwatch strapped to the left wrist. Ultimately, we want to determine whether the action primitives commonly used in HAR are optimal and, if not, suggest a better set of primitives.
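The evaluation described above can be sketched as follows. This is a minimal illustration, not the paper's actual pipeline: the synthetic per-window features stand in for the real accelerometer/gyroscope readings, and KMeans with the Adjusted Rand Index is an assumed (common) choice of clustering algorithm and agreement metric.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

# Hypothetical stand-in for windowed sensor features (e.g. mean/std per axis):
# one well-separated Gaussian blob per assumed action primitive.
primitives = ["sit", "stand", "run", "walk", "jump", "lie down"]
rng = np.random.default_rng(0)
X = np.vstack([
    rng.normal(loc=i * 3.0, scale=0.3, size=(50, 6))
    for i in range(len(primitives))
])
y_true = np.repeat(np.arange(len(primitives)), 50)

# Cluster the unlabeled feature windows with as many clusters as primitives.
km = KMeans(n_clusters=len(primitives), n_init=10, random_state=0)
y_pred = km.fit_predict(X)

# The Adjusted Rand Index scores agreement between the cluster assignments
# and the labeled primitives, independent of cluster numbering
# (1.0 = perfect agreement, ~0.0 = chance).
ari = adjusted_rand_score(y_true, y_pred)
print(f"ARI: {ari:.2f}")
```

A low ARI on real sensor data would suggest that the natural structure in the motion signals does not align with the conventional primitive set, motivating the search for a better one.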
