Abstract
Objective: This paper presents a novel real-time algorithm for fall detection that contextualizes falls by identifying the activities occurring both pre- and post-impact, using machine learning techniques and wearable sensors. Methodology: The activities selected to contextualize fall events were standing, lying, walking, running, climbing stairs, and using the elevator. Data were collected using an inertial measurement unit and a barometric altimeter positioned on the participants’ lower backs. Thirteen healthy subjects were observed performing the activities, and fall events were recorded from five healthy subjects. The proposed algorithm combines thresholding and cascade support vector machines (SVMs), and its robustness is enhanced by a verification of the subject’s posture aimed at determining the occurrence of the fall more accurately. Results: The performance of the algorithm was evaluated in terms of the hit rate (HT), both offline and in real time. Of the activities studied, stair climbing proved the most challenging to detect, with an offline HT of 85% and an online HT of 76%. The overall offline performance (HT of 96%) was superior to that achieved online (HT of 91%); in both cases the fall detection HT was 100%. Conclusions: The algorithm can recognize fall events for any user, as the nonlinear nature of the SVMs means no prior adaptation to the individual is needed. The cascade SVMs allow small sets of variables to be used, leading to low computational cost and a suitable real-time implementation. These features, together with the posture verification process, make our algorithm suitable for activity recognition in non-laboratory environments.
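To illustrate the structure described above, the following is a minimal sketch of a three-stage pipeline: a threshold test on acceleration magnitude flags a candidate impact, a cascade of binary SVMs labels the surrounding activity, and a posture check confirms the fall. The threshold values, feature layout, cascade order, and helper names here are illustrative assumptions, not the authors' published parameters.

```python
# Hedged sketch of a threshold + cascade-SVM + posture-verification pipeline.
# All numeric thresholds and feature choices below are assumed for illustration.
import numpy as np
from sklearn.svm import SVC

IMPACT_THRESHOLD_G = 2.5        # assumed acceleration-magnitude threshold (g)
LYING_TRUNK_ANGLE_DEG = 60.0    # assumed trunk-tilt angle indicating a lying posture


def candidate_impact(acc_window: np.ndarray) -> bool:
    """Stage 1: simple thresholding on the peak acceleration magnitude."""
    magnitude = np.linalg.norm(acc_window, axis=1)
    return magnitude.max() > IMPACT_THRESHOLD_G


class CascadeSVM:
    """Stage 2: cascade of binary SVMs, each stage splitting off one activity.

    Small per-stage feature sets keep the cascade cheap, which is the
    rationale for real-time use given in the abstract.
    """

    def __init__(self, stages):
        # stages: list of (activity_label, fitted SVC) pairs, applied in order
        self.stages = stages

    def predict(self, features: np.ndarray) -> str:
        for label, clf in self.stages:
            if clf.predict(features.reshape(1, -1))[0] == 1:
                return label
        return "unknown"


def posture_confirms_fall(trunk_angle_deg: float, altitude_drop_m: float) -> bool:
    """Stage 3: verify a post-impact lying posture and/or a barometric altitude drop."""
    return trunk_angle_deg > LYING_TRUNK_ANGLE_DEG or altitude_drop_m > 0.4


# --- toy usage with synthetic data, for illustration only ---
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))               # stand-in IMU/altimeter features
y = (X[:, 0] > 0).astype(int)               # stand-in binary activity labels
walking_svm = SVC(kernel="rbf").fit(X, y)   # nonlinear SVM, as in the paper
cascade = CascadeSVM([("walking", walking_svm)])

acc = rng.normal(scale=3.0, size=(50, 3))   # one window of triaxial acceleration
if candidate_impact(acc) and posture_confirms_fall(trunk_angle_deg=75.0,
                                                   altitude_drop_m=0.5):
    print("fall detected; surrounding activity:", cascade.predict(X[0]))
```

In this sketch the posture check acts as the verification step mentioned in the abstract, filtering out impacts that do not end in a lying posture or a height drop; a full cascade would add one fitted SVM per activity class.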