Abstract

Human Activity Recognition (HAR) provides the context for many user-centered personal recommender systems in areas such as healthcare, sports, lifelong learning, or home automation. Data gathered from different types of sensors (camera-based, environmental, or wearable and mobile sensors) provide the basis to extract movement-related features from which the activity the user is performing can be inferred. Among the different types of sensors, wearable sensors provide a convenient, non-intrusive, always-available alternative that has gained special attention for HAR. Wearable sensors will be a relevant part of the Internet of Things. This paper presents a novel mechanism to detect which particular activity a user is performing based on the data from a single tri-axial accelerometer. A Convolutional Neural Network is used to automatically extract the features that best characterize acceleration patterns and discriminate between activities. The accelerometer data, expressed in the user-anchored coordinate system of the device, are transformed into a georeferenced coordinate system in order to estimate the horizontal and vertical acceleration components. A sliding window with 50% overlap is used to extract 5-second segments of acceleration data, from which a square horizontal-vertical acceleration image is computed. Both monochrome and colored images are generated, depending on whether the time evolution of the acceleration series is encoded in the image. Results for both p-fold cross-validation and leave-one-out approaches are presented using a public dataset. Under p-fold cross-validation, the results outperform those obtained by the authors of the dataset by around 8%.
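The abstract summarizes the pipeline but not its exact operations. The following is a minimal sketch of one plausible implementation of the preprocessing steps it names, assuming a gravity-projection estimate of the vertical axis and a 2-D histogram as the "acceleration image"; the function names (`vertical_horizontal`, `sliding_windows`, `hv_image`), the exponential low-pass gravity filter, and the 32×32 image size are illustrative assumptions, not details from the paper.

```python
import numpy as np

def vertical_horizontal(acc, fs, cutoff_hz=0.3):
    """Split tri-axial accelerometer samples (N x 3, in g) into estimated
    vertical and horizontal components, using a low-pass estimate of the
    gravity vector as the vertical (georeferenced) reference axis.
    NOTE: assumed approach; the paper's exact transform is not given here."""
    dt = 1.0 / fs
    rc = 1.0 / (2.0 * np.pi * cutoff_hz)
    alpha = dt / (rc + dt)
    # Exponential moving average approximates gravity in device coordinates.
    gravity = np.empty_like(acc)
    gravity[0] = acc[0]
    for i in range(1, len(acc)):
        gravity[i] = alpha * acc[i] + (1 - alpha) * gravity[i - 1]
    g_unit = gravity / np.linalg.norm(gravity, axis=1, keepdims=True)
    # Vertical component: projection of acceleration onto the gravity axis.
    vertical = np.sum(acc * g_unit, axis=1)
    # Horizontal component: magnitude of the residual in the ground plane.
    horizontal = np.linalg.norm(acc - vertical[:, None] * g_unit, axis=1)
    return vertical, horizontal

def sliding_windows(n_samples, fs, win_seconds=5, overlap=0.5):
    """Yield (start, end) indices for 5-second windows with 50% overlap,
    matching the segmentation described in the abstract."""
    win = int(win_seconds * fs)
    step = int(win * (1 - overlap))
    for start in range(0, n_samples - win + 1, step):
        yield start, start + win

def hv_image(vertical, horizontal, size=32):
    """Rasterize one window into a square horizontal-vertical acceleration
    image, here as a normalized 2-D histogram of (horizontal, vertical)
    sample pairs (monochrome variant, no time-evolution channel)."""
    img, _, _ = np.histogram2d(horizontal, vertical, bins=size)
    return img / img.max() if img.max() > 0 else img
```

A colored variant could encode time evolution, for example by splitting each window into thirds and mapping each third's histogram to one RGB channel; again, this is one possible reading of the abstract rather than the paper's confirmed method.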
