Wearable devices and deep learning methods for Human Activity Recognition (HAR) have attracted considerable interest because of their potential to transform healthcare monitoring. This study presents a hybrid CNN-LSTM model for accurate and reliable recognition of human activities from smartphone sensor data. The proposed model combines the strengths of Convolutional Neural Networks (CNNs) for extracting spatial features with those of Long Short-Term Memory (LSTM) networks for modeling temporal dependencies, allowing it to capture how the input signals vary across both space and time. The study also examines the feasibility and practicality of this approach in real-world healthcare settings, focusing on applications such as remote patient monitoring, elderly care, and therapy. The proposed model was evaluated on publicly available benchmark datasets, and various architectural configurations and hyperparameters were examined to assess their effect on performance. The proposed CNN-LSTM model performed well and shows strong potential for practical use in activity tracking and context-aware systems.
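To make the described architecture concrete, the following is a minimal sketch of a CNN-LSTM classifier for windowed smartphone sensor data. It is not the authors' configuration: the window length (128 timesteps), channel count (9, e.g. tri-axial accelerometer, gyroscope, and total acceleration as in the UCI HAR dataset), six activity classes, and all layer sizes are illustrative assumptions.

```python
# Illustrative CNN-LSTM sketch for HAR on windowed smartphone sensor data.
# Assumptions (not taken from the paper): 128-timestep windows with 9 sensor
# channels and 6 activity classes; layer sizes are placeholders.
import tensorflow as tf
from tensorflow.keras import layers, models

TIMESTEPS, CHANNELS, NUM_CLASSES = 128, 9, 6

def build_cnn_lstm(timesteps=TIMESTEPS, channels=CHANNELS, num_classes=NUM_CLASSES):
    model = models.Sequential([
        layers.Input(shape=(timesteps, channels)),
        # 1D convolutions extract local spatial features from each sensor window.
        layers.Conv1D(64, kernel_size=3, activation="relu"),
        layers.Conv1D(64, kernel_size=3, activation="relu"),
        layers.MaxPooling1D(pool_size=2),
        layers.Dropout(0.3),
        # The LSTM models how the extracted features evolve over time.
        layers.LSTM(100),
        layers.Dropout(0.3),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    build_cnn_lstm().summary()
```

In this kind of hybrid, the convolutional front end reduces each raw sensor window to a shorter sequence of learned feature vectors, and the recurrent layer then classifies the activity from how those features change over the window.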