Abstract

Automatic detection of a falling person based on noncontact sensing is a challenging problem with applications in smart homes for elderly care. In this article, we propose a radar-based fall detection technique built on time-frequency analysis and convolutional neural networks. The time-frequency analysis is performed by applying the short-time Fourier transform to each radar return signal. The resulting spectrograms are converted into binary images, which are fed into the convolutional neural network. The network is trained using labeled examples of fall and nonfall activities. Our method employs high-level feature learning, which distinguishes it from previously studied methods that rely on heuristic feature extraction. The performance of the proposed method is evaluated through several experiments on a set of radar return signals. We show that our method distinguishes falls from nonfalls with 98.37% precision and 97.82% specificity while maintaining a low false-alarm rate, outperforming existing methods. We also show that our method is robust: it successfully distinguishes falls from nonfalls when trained on subjects in one room but tested on different subjects in a different room. In the proposed convolutional neural network, the hierarchical features extracted from the radar return signals are the key to understanding the fundamental composition of human activities and to determining whether a fall has occurred during daily activities. Our method may be extended to other radar-based applications such as apnea detection and gesture detection.
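The preprocessing pipeline described above (STFT of the radar return, then binarization of the spectrogram) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the window length, overlap, and binarization threshold used here are assumptions, as the abstract does not specify them.

```python
import numpy as np
from scipy.signal import spectrogram

def radar_to_binary_spectrogram(signal, fs, threshold_db=-40.0):
    """Convert a radar return signal to a binary time-frequency image.

    Hypothetical sketch: the paper's exact STFT parameters and
    binarization rule are not given in the abstract.
    """
    # Short-time Fourier analysis of the radar return
    f, t, Sxx = spectrogram(signal, fs=fs, nperseg=128, noverlap=96)
    # Log-magnitude spectrogram, normalized to its peak
    Sxx_db = 10.0 * np.log10(Sxx + 1e-12)
    Sxx_db -= Sxx_db.max()
    # Threshold to a binary image (1 = significant Doppler energy)
    return (Sxx_db > threshold_db).astype(np.uint8)

# Synthetic example: a chirp mimicking a time-varying Doppler signature
fs = 1000
tt = np.arange(0.0, 2.0, 1.0 / fs)
sig = np.cos(2 * np.pi * (50 + 100 * tt) * tt)
img = radar_to_binary_spectrogram(sig, fs)
```

The resulting 2-D binary image would then be resized to a fixed shape and passed to the convolutional network as a single-channel input.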
