Abstract

Activity recognition from sensor data is a context-aware research problem that has attracted many research teams. This paper describes how different types of physical activities are recognized from a smartphone's motion-sensor data. Data from three motion sensors (accelerometer, gravity, and linear acceleration) were used. We propose an effective and efficient deep neural network model that recognizes human physical activities from tri-axial motion-sensor data in real time; the model was later deployed on Android smartphones using Android Studio. A sequential model was implemented, comprising LSTM, flatten, and dense layers. The model classifies seven types of human activities from a real-time data feed and achieves 98.8% classification accuracy during training and testing. Because the initial deep learning model cannot be embedded in a smartphone directly, it was converted to a smartphone-compatible model using TensorFlow and successfully deployed on an Android smartphone.
