Abstract

The advancement of technology, with multiple sensors embedded in smartphones, has led to the widespread use of smartphones in human activity analysis and recognition. This enables a variety of ambient assisted living applications, such as fitness tracking, fall detection, home automation, and healthcare monitoring. In this paper, a human activity recognition model based on the amalgamation of statistical global features and local deep features is presented. The proposed model adopts a temporal convolutional architecture to extract long-range temporal patterns from the inertial activity signals captured by smartphones. To further enrich the representation, statistical features are computed to encode the global characteristics of the time series data. Both the global and local deep features are then combined for classification. The proposed model is evaluated on the WISDM and UCI HAR datasets under user-dependent and user-independent protocols, respectively, to assess its feasibility as both a user-dependent and a user-independent HAR solution. The empirical results show that the proposed model outperforms existing deep learning models under both testing protocols.
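To illustrate the fusion strategy the abstract describes, the following is a minimal sketch (not the authors' code) of combining local features extracted by a temporal convolutional network with per-channel statistical global features before a classification head. Layer widths, the choice of statistics (mean, standard deviation, minimum, maximum), and the class count are illustrative assumptions.

```python
# Hypothetical sketch: TCN local features + statistical global features for HAR.
# Architecture details are assumptions, not the paper's exact configuration.
import torch
import torch.nn as nn


class TCNBlock(nn.Module):
    """Dilated causal 1D convolution block with a residual connection."""

    def __init__(self, channels, kernel_size=3, dilation=1):
        super().__init__()
        pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(channels, channels, kernel_size,
                              padding=pad, dilation=dilation)
        self.relu = nn.ReLU()

    def forward(self, x):
        out = self.conv(x)[..., :x.size(-1)]  # trim padding to keep causal length
        return self.relu(out + x)             # residual connection


class HybridHAR(nn.Module):
    """Concatenates TCN local deep features with per-channel global statistics."""

    def __init__(self, in_channels=3, hidden=64, num_classes=6):
        super().__init__()
        self.inp = nn.Conv1d(in_channels, hidden, kernel_size=1)
        # Stacked dilated blocks widen the receptive field exponentially,
        # capturing long-range temporal patterns in the inertial signal.
        self.tcn = nn.Sequential(*[TCNBlock(hidden, dilation=2 ** i)
                                   for i in range(4)])
        self.pool = nn.AdaptiveAvgPool1d(1)
        stat_dim = 4 * in_channels  # mean, std, min, max per input channel
        self.classifier = nn.Linear(hidden + stat_dim, num_classes)

    @staticmethod
    def global_stats(x):
        # x: (batch, channels, time) -> simple global statistical features
        return torch.cat([x.mean(-1), x.std(-1),
                          x.amin(-1), x.amax(-1)], dim=1)

    def forward(self, x):
        local = self.pool(self.tcn(self.inp(x))).squeeze(-1)   # local deep features
        fused = torch.cat([local, self.global_stats(x)], dim=1)  # feature fusion
        return self.classifier(fused)


# Example: a batch of 8 tri-axial accelerometer windows, 128 samples each.
if __name__ == "__main__":
    model = HybridHAR()
    logits = model(torch.randn(8, 3, 128))
    print(logits.shape)  # torch.Size([8, 6])
```

The key design choice shown here is that the handcrafted statistics bypass the convolutional stack entirely and are concatenated with the pooled deep features, so the classifier sees both the global summary of each window and the learned temporal patterns.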
