Abstract

Activity recognition from sensor data is a context-aware research problem that has attracted many research teams. This paper explains how different types of physical activities are recognized from a cell phone's motion-sensor data. Data from three types of motion sensors (accelerometer, gravity, and linear acceleration) were used for this work. An effective and efficient deep neural network model is proposed that recognizes human physical activities from tri-axial motion-sensor data in real time, and was later implemented on Android-based smartphones using Android Studio. A sequential model was used, consisting of LSTM, flatten, and dense layers. The model classifies seven types of human activities from a real-time data feed and achieves 98.8% classification accuracy during training and testing. The model was then converted to a smartphone-compatible format using TensorFlow, since the initial deep learning model cannot be deployed on smartphones directly. The converted model was successfully deployed on an Android-based smartphone.
