Abstract

Human Activity Recognition (HAR) is a field that infers human activities from raw time-series signals acquired through the embedded sensors of smartphones and wearable devices. It has gained much attention in various smart home environments, especially for continuously monitoring human behavior in ambient assisted living to provide elderly care and rehabilitation. A typical HAR system comprises several operational modules: data acquisition, pre-processing to eliminate noise and distortions, feature extraction, feature selection, and classification. Recently, many state-of-the-art approaches have proposed feature extraction and selection techniques whose outputs are classified using traditional machine learning classifiers. However, most of these approaches rely on rudimentary feature extraction processes that are incapable of recognizing complex activities. With the emergence and advancement of high-performance computational resources, deep learning techniques are now widely used in HAR systems to extract features and perform classification efficiently. This review paper therefore focuses on providing a thorough yet concise survey of deep learning techniques used in smartphone- and wearable-sensor-based recognition systems. The reviewed techniques are categorized into conventional and hybrid deep learning models, each described with its distinctive characteristics, merits, and limitations. The paper also discusses the benchmark datasets used by existing techniques. Finally, the paper identifies open challenges and issues that warrant future research and improvement.
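To make the hybrid-model category concrete, the sketch below shows a minimal CNN-LSTM classifier of the kind commonly surveyed in this literature: convolutions extract local motion features from windowed inertial signals, and a recurrent layer models their temporal dependencies. This is an illustrative PyTorch sketch only; the window length (128 samples), six sensor channels, six activity classes, and all layer sizes are assumptions for demonstration, not parameters taken from any specific reviewed technique.

```python
# Illustrative sketch: a minimal hybrid CNN-LSTM for windowed sensor
# time series (e.g., 3-axis accelerometer + 3-axis gyroscope).
# All shapes and hyperparameters below are assumed for demonstration.
import torch
import torch.nn as nn

class CNNLSTMHar(nn.Module):
    def __init__(self, n_channels=6, n_classes=6, hidden=64):
        super().__init__()
        # 1-D convolutions extract local motion features within a window
        self.conv = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        # LSTM models temporal dependencies across the window
        self.lstm = nn.LSTM(64, hidden, batch_first=True)
        self.fc = nn.Linear(hidden, n_classes)

    def forward(self, x):
        # x: (batch, channels, time), e.g., (32, 6, 128)
        h = self.conv(x)            # (batch, 64, time)
        h = h.transpose(1, 2)       # (batch, time, 64) for the LSTM
        _, (h_n, _) = self.lstm(h)  # h_n: (1, batch, hidden)
        return self.fc(h_n[-1])     # per-window activity logits

# Usage: classify a batch of 128-sample windows from 6 sensor channels
model = CNNLSTMHar()
windows = torch.randn(32, 6, 128)
logits = model(windows)             # (32, 6) activity logits
```

The design choice reflected here is the usual division of labor in hybrid models: the CNN replaces handcrafted feature extraction, while the LSTM replaces explicit temporal feature engineering, so the whole pipeline is learned end to end.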

