Abstract

Human activity recognition (HAR) plays a crucial role in assisting the elderly and individuals with vascular dementia by providing support and monitoring for their daily activities. This paper presents a deep learning (DL)-based approach to HAR, leveraging convolutional neural network (CNN), convolutional long short-term memory (ConvLSTM), and long-term recurrent convolutional network (LRCN) architectures. These models are designed to extract spatial features and capture temporal dependencies in video data, enhancing the accuracy of activity classification. We conducted experiments on the UCF50 and HMDB51 video datasets, which encompass diverse human activities. Our evaluation shows that the ConvLSTM model achieves an accuracy of 82% on UCF50 and 68% on HMDB51, while the LRCN model achieves accuracies of 93.44% and 71.55%, respectively. The CNN model outperforms both, reaching 99.58% on UCF50 and 92.70% on HMDB51. These results demonstrate the effectiveness of integrating convolutional and recurrent neural networks for HAR tasks. Our research contributes to advancing HAR systems with potential applications in healthcare, assisted living, and surveillance. By accurately recognizing human activities, our models can assist in remote patient monitoring, fall detection, and public safety initiatives. These findings underscore the importance of DL in enhancing the quality of life and safety for individuals in various contexts.
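The LRCN idea mentioned above, extracting spatial features from each frame with a convolution and then passing the per-frame features through a recurrent layer, can be illustrated with a toy sketch. This is a minimal NumPy illustration of the general pattern only; the layer sizes, kernel, and recurrence here are made-up placeholders, not the architecture or hyperparameters used in the paper.

```python
import numpy as np

# Toy LRCN-style pipeline: per-frame spatial convolution, then a simple
# recurrence over the frame sequence. All shapes are illustrative only.
rng = np.random.default_rng(0)

def conv2d_valid(frame, kernel):
    """Naive 'valid' 2D convolution over a single grayscale frame."""
    kh, kw = kernel.shape
    h, w = frame.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(frame[i:i + kh, j:j + kw] * kernel)
    return out

def lrcn_forward(video, kernel, w_h, w_x):
    """Spatial features per frame, then a tanh recurrence across frames."""
    h = np.zeros(w_h.shape[0])
    for frame in video:                       # temporal loop over frames
        feat = conv2d_valid(frame, kernel)    # spatial features (CNN stage)
        x = np.maximum(feat, 0).mean(axis=0)  # ReLU + crude pooling to a vector
        h = np.tanh(w_h @ h + w_x @ x)        # recurrent state carries context
    return h                                  # final state summarizes the clip

# Fake "video": 8 frames of 16x16 grayscale pixels.
video = rng.standard_normal((8, 16, 16))
kernel = rng.standard_normal((3, 3))
w_h = rng.standard_normal((4, 4)) * 0.1
w_x = rng.standard_normal((4, 14)) * 0.1

state = lrcn_forward(video, kernel, w_h, w_x)
print(state.shape)  # (4,) -- a clip-level feature vector for a classifier
```

In a real model the hand-rolled convolution and tanh recurrence would be replaced by trained Conv2D and LSTM layers, and the final state would feed a softmax over the activity classes.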
