Abstract
Human Action Recognition (HAR) from sensor time series has become a trending topic for the scientific community, driven by the rapid evolution and widespread availability of smart devices in daily life, as well as by advances in AI-based applications. HAR systems are now exploited in sensitive domains such as healthcare and intelligent surveillance. Following the breakthroughs of deep learning in natural language processing and speech recognition, the recurrent architectures LSTM and GRU have become the most popular choices for automatically extracting temporal features and classifying human activities. However, the quality of this extraction varies across activity types, such as static activities versus dynamic activities. Our main objective in this work is to build a hybrid RNN model evaluated on raw 3D accelerometer data from smartphones. The model aims to combine the strengths of both RNN architectures and to classify the different human activities in a more effective and balanced way. We used accuracy, precision, and other relevant metrics to compare our hybrid model against two simple RNN models. The results show that our hybrid model classifies the usual activities found in the UCI Heterogeneity HAR dataset more effectively.
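The model described above consumes raw 3D accelerometer time series. A minimal sketch, assuming NumPy, of how such raw signals are typically segmented into fixed-length windows before being fed to an RNN (the window length of 128 samples and step of 64 are illustrative assumptions, not values stated in the abstract):

```python
import numpy as np

def sliding_windows(signal, window=128, step=64):
    """Segment a (T, 3) array of raw accelerometer samples (x, y, z)
    into overlapping fixed-length windows of shape (window, 3)."""
    starts = range(0, len(signal) - window + 1, step)
    return np.stack([signal[s:s + window] for s in starts])

# Hypothetical raw recording: 1000 samples of 3-axis acceleration.
acc = np.random.randn(1000, 3)
segments = sliding_windows(acc)
print(segments.shape)  # (14, 128, 3): 14 windows, each 128 timesteps x 3 axes
```

Each resulting window then serves as one input sequence for the recurrent layers, with the activity label assigned per window.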