Abstract
Falls are a leading cause of accidental injury among the elderly. With the rapid growth of the elderly population, fall detection has become a critical issue in the medical and healthcare fields. In this paper, we propose a model based on an improved attention mechanism, CBAM-IAM-CNN-BiLSTM, to detect falls in the elderly accurately and in a timely manner. The model comprises a convolution layer, a bidirectional LSTM layer, a sampling layer and a dense layer, and incorporates an improved convolutional block attention module (CBAM) into the network structure, in which a one-dimensional convolution layer replaces the dense layer to aggregate information across channels, allowing the model to extract different behavior characteristics more accurately. The acceleration and angular velocity data of the human body, collected by wearable sensors, are input into the convolution layer and the bidirectional LSTM layer of the model, respectively, and then classified by softmax after feature fusion. Compared with models such as CNN and CNN-BiLSTM, as well as with different attention mechanisms such as squeeze-and-excitation (SE), efficient channel attention (ECA) and the standard convolutional block attention module (CBAM), the proposed model improves accuracy, sensitivity and specificity to varying degrees. The experimental results show that the accuracy, sensitivity and specificity of the proposed CBAM-IAM-CNN-BiLSTM model are 97.37%, 97.29% and 99.56%, respectively, which demonstrates that the model has good practicability and strong generalization ability.
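The following is a minimal sketch (not the authors' code) of the architecture described above, written in PyTorch with illustrative hyperparameters: CBAM's channel-attention dense layers are replaced by a shared one-dimensional convolution that aggregates information across channels, followed by CBAM's spatial attention; the attended CNN features are fused with BiLSTM features before classification. All layer sizes, kernel sizes and the two-class output are assumptions for illustration only.

```python
import torch
import torch.nn as nn


class ImprovedChannelAttention(nn.Module):
    """Channel attention in which a 1-D convolution replaces the dense layers."""
    def __init__(self, kernel_size: int = 3):
        super().__init__()
        self.conv = nn.Conv1d(1, 1, kernel_size, padding=kernel_size // 2, bias=False)

    def forward(self, x):                          # x: (batch, channels, length)
        avg = x.mean(dim=-1, keepdim=True)         # global average pooling -> (B, C, 1)
        mx, _ = x.max(dim=-1, keepdim=True)        # global max pooling     -> (B, C, 1)
        # Aggregate cross-channel information with a shared 1-D convolution.
        w = self.conv(avg.transpose(1, 2)) + self.conv(mx.transpose(1, 2))
        w = torch.sigmoid(w).transpose(1, 2)       # (B, C, 1) channel weights
        return x * w


class SpatialAttention(nn.Module):
    """CBAM-style spatial attention adapted to 1-D sensor sequences."""
    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv1d(2, 1, kernel_size, padding=kernel_size // 2, bias=False)

    def forward(self, x):                          # x: (B, C, L)
        avg = x.mean(dim=1, keepdim=True)
        mx, _ = x.max(dim=1, keepdim=True)
        w = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * w


class CBAMIAMCNNBiLSTM(nn.Module):
    """CNN branch with improved CBAM, fused with a BiLSTM branch, then softmax."""
    def __init__(self, in_channels: int = 6, num_classes: int = 2, hidden: int = 64):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv1d(in_channels, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),                       # sampling layer
            ImprovedChannelAttention(),
            SpatialAttention(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.bilstm = nn.LSTM(in_channels, hidden, batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(64 + 2 * hidden, num_classes)

    def forward(self, x):                          # x: (B, accel+gyro channels, L)
        cnn_feat = self.cnn(x).squeeze(-1)         # (B, 64)
        lstm_out, _ = self.bilstm(x.transpose(1, 2))
        lstm_feat = lstm_out[:, -1, :]             # (B, 2*hidden)
        fused = torch.cat([cnn_feat, lstm_feat], dim=1)
        return self.classifier(fused)              # logits; softmax applied in the loss


# Example: a batch of 8 windows, 6 sensor channels (3-axis accel + 3-axis gyro), 128 samples.
if __name__ == "__main__":
    model = CBAMIAMCNNBiLSTM()
    print(model(torch.randn(8, 6, 128)).shape)     # torch.Size([8, 2])
```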