Abstract

The wide use of motion sensors in today's smartphones has enabled a range of innovative applications for which these sensors were not originally designed. Human activity recognition and smartphone position detection are two of them. In this paper, we present a system for the joint recognition of human activity and smartphone position. Our study shows that applying a coordinate transformation to the motion data makes the system robust to variations in smartphone orientation. Contrary to popular belief, a simple neural network achieves accuracy comparable to deep learning models on this problem. In addition, our results suggest that the motion sensor sampling rate is another key factor in recognition performance.
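The abstract does not spell out which coordinate transformation is used. A common way to obtain orientation invariance is to rotate device-frame accelerometer samples into an Earth-fixed frame using gravity and magnetic-field estimates; the following Python sketch illustrates that idea only. All function names and sensor values here are illustrative assumptions, not the paper's actual method or code.

```python
import numpy as np

def rotation_matrix_from_gravity_magnetic(gravity, magnetic):
    """Build a device-to-Earth rotation matrix from a gravity estimate and a
    magnetic-field reading (both in the device frame), analogous in spirit to
    Android's SensorManager.getRotationMatrix()."""
    up = gravity / np.linalg.norm(gravity)     # Earth Z axis (along gravity)
    east = np.cross(magnetic, up)              # Earth X axis (points east)
    east /= np.linalg.norm(east)
    north = np.cross(up, east)                 # Earth Y axis (points north)
    # Rows are the Earth axes expressed in device coordinates,
    # so R @ v_device gives the vector in Earth coordinates.
    return np.vstack([east, north, up])

def to_earth_frame(accel_device, gravity, magnetic):
    """Rotate a device-frame accelerometer sample into the Earth frame so that
    derived features no longer depend on how the phone is oriented."""
    R = rotation_matrix_from_gravity_magnetic(gravity, magnetic)
    return R @ accel_device

# Hypothetical single sample: phone lying flat, acceleration in m/s^2.
gravity  = np.array([0.0, 0.0, 9.81])    # e.g. low-pass filtered accelerometer
magnetic = np.array([0.0, 22.0, -40.0])  # magnetometer reading (uT)
accel    = np.array([0.3, 1.2, 9.9])
print(to_earth_frame(accel, gravity, magnetic))
```

Features computed on the transformed signal (e.g., vertical vs. horizontal components) remain stable when the phone is rotated in a pocket or hand, which is the property the abstract attributes to the coordinate transformation approach.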
