Abstract

Approximately 37 million falls requiring medical attention occur worldwide each year. Fall victims are often unable to call for help, a particular risk for elderly persons living alone. Several approaches have been proposed to detect falls at home, and video cameras are increasingly used for this purpose. Recently, novel machine learning techniques have achieved high accuracy in real-time human pose estimation from video. In this work, we propose a multi-camera system for video-based fall detection. We augment human pose estimation (the OpenPifPaf algorithm) with support for multi-camera and multi-person tracking and with a long short-term memory (LSTM) neural network that predicts one of two classes: “Fall” or “No Fall”. From the poses, we extract five temporal and spatial features that are processed by the LSTM. To evaluate identification and tracking with multiple cameras, we used videos recorded in a smart home (living lab) with two persons walking and interacting. To evaluate fall detection, we used the UP-Fall Detection dataset, on which we achieve an F1 score of 92.5%. We observed a tendency towards false positive classifications because publicly available datasets contain few activities that look similar to falls but stem from normal behavior; this lack of variation in the activities further inflates the number of false positives and calls for the acquisition of more balanced datasets in future work. In conclusion, real-time fall detection from multiple camera inputs and for multiple persons is feasible using an LSTM neural network combined with features obtained via human pose estimation. Source code is available at https://github.com/taufeeque9/HumanFallDetection.
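To illustrate the classification stage described above, the sketch below shows how an LSTM can map a per-frame sequence of five pose-derived features to the two classes “Fall” and “No Fall”. This is a minimal PyTorch sketch, not the authors' implementation: the class name FallLSTM, the hidden size, and the label ordering are illustrative assumptions; see the linked repository for the actual code.

# Minimal sketch (assumed, not the authors' code) of an LSTM fall classifier
# over sequences of five pose-derived features per video frame.
import torch
import torch.nn as nn

class FallLSTM(nn.Module):
    def __init__(self, n_features: int = 5, hidden_size: int = 64, n_classes: int = 2):
        super().__init__()
        # Consume one feature vector per frame; hidden_size is an assumed hyperparameter.
        self.lstm = nn.LSTM(input_size=n_features, hidden_size=hidden_size,
                            batch_first=True)
        self.head = nn.Linear(hidden_size, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, n_features) -- temporal/spatial features from pose estimation
        out, _ = self.lstm(x)
        # Classify from the hidden state of the last time step
        return self.head(out[:, -1, :])

# Example: a batch of 4 clips, 30 frames each, 5 features per frame
model = FallLSTM()
logits = model(torch.randn(4, 30, 5))
pred = logits.argmax(dim=1)  # 0 = "No Fall", 1 = "Fall" (label order assumed)

Classifying from the last hidden state keeps the model applicable to streaming input: at inference time, a sliding window of recent frames can be fed per tracked person, which matches the multi-person, real-time setting described in the abstract.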
