Abstract

Mobile phones are equipped with a rich set of sensors, such as accelerometers, magnetometers, gyroscopes, photometers, orientation sensors, and gravity sensors. These sensors can be used for human activity recognition in the ubiquitous computing domain. Most reported studies consider acceleration signals collected from a known, fixed device location and orientation. This paper describes how more accurate results for basic activity recognition can be achieved with transformed accelerometer data. Based on the rotation matrix (Euler angle conversion) derived from the orientation angles of gyroscopes and orientation sensors, we transform input signals into a reference coordinate system. The advantage of this transformation is that it allows activity classification and recognition to be carried out independently of the orientation of the sensors. We consider five user activities: staying, walking, running, ascending stairs, and descending stairs, with the phone placed in the subject's hand, in a pants pocket, or in a handbag. The results show that an overall orientation-independent accuracy of 84.77% is achieved, an improvement of 17.26% over classification without input transformation. Copyright © 2014 John Wiley & Sons, Ltd.
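The core idea of the transformation described above can be sketched as follows: a rotation matrix is built from the device's Euler angles (roll, pitch, yaw, as reported by the orientation/gyroscope sensors) and applied to each accelerometer sample to express it in a fixed reference frame. This is a minimal illustrative sketch, not the authors' implementation; the ZYX Euler convention and the function names are assumptions.

```python
import math

def rotation_matrix(roll, pitch, yaw):
    """Rotation matrix from Euler angles, ZYX convention (assumed):
    R = Rz(yaw) * Ry(pitch) * Rx(roll)."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def to_reference_frame(accel, roll, pitch, yaw):
    """Rotate a device-frame acceleration sample [ax, ay, az]
    into the reference (world) coordinate frame."""
    R = rotation_matrix(roll, pitch, yaw)
    return [sum(R[i][j] * accel[j] for j in range(3)) for i in range(3)]

# Example: with zero rotation the sample is unchanged; a 90-degree roll
# maps the device z-axis onto the negative world y-axis.
print(to_reference_frame([0.0, 0.0, 9.81], 0.0, 0.0, 0.0))
print(to_reference_frame([0.0, 0.0, 1.0], math.pi / 2, 0.0, 0.0))
```

Features computed on the rotated samples no longer depend on how the phone is oriented in the hand, pocket, or bag, which is what enables the orientation-independent classification reported in the paper.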

