Abstract

Human Activity Recognition (HAR), the use of machine learning to identify time spent in activities such as walking, sitting, and standing, is widely used in health and wellness wearable devices, in ambient assisted living devices, and in rehabilitation. In this paper, a stacked Long Short-Term Memory (LSTM) network is designed for HAR and implemented on a smartphone. Performing the processing on an edge device means that the raw collected data does not need to be sent to the cloud, mitigating potential bandwidth, power consumption, and privacy concerns. Our offline prototype model achieves 92.8% classification accuracy across 6 activities on a public dataset. Quantization techniques are shown to reduce the precision of the model's weight representations, yielding a >30x reduction in model size for deployment on a smartphone. The end result is an on-phone HAR model with 92.7% accuracy and a 27 KB memory footprint.
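As a rough illustration of the approach outlined above, the sketch below builds a two-layer stacked LSTM classifier in Keras and applies TensorFlow Lite post-training quantization to shrink the stored weights. The window length, channel count, layer widths, and the TensorFlow Lite toolchain are assumptions for illustration only; the paper's exact architecture, training setup, and quantization scheme may differ.

```python
# Hypothetical sketch: stacked LSTM for HAR plus post-training quantization.
# Layer sizes, window length, and channel count are illustrative assumptions.
import tensorflow as tf

WINDOW_LEN = 128   # samples per sliding window (assumed)
N_CHANNELS = 9     # e.g. 3-axis accelerometer + gyroscope + total acceleration (assumed)
N_CLASSES = 6      # six target activities, as in the paper

# Stacked LSTM: the first LSTM returns the full sequence so a second
# LSTM layer can be stacked on top of it.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW_LEN, N_CHANNELS)),
    tf.keras.layers.LSTM(64, return_sequences=True),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(N_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(x_train, y_train, epochs=..., validation_data=(x_val, y_val))

# Post-training quantization with TensorFlow Lite: weights are stored at
# reduced precision, shrinking the model file for on-phone deployment.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()
with open("har_lstm_quantized.tflite", "wb") as f:
    f.write(tflite_model)
```

The resulting .tflite file can then be bundled with a mobile app and executed with the TensorFlow Lite interpreter, keeping all sensor data on the device.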
