Abstract

Human activity recognition has many potential applications. In an aged-care facility, it is crucial to monitor elderly patients and assist them in the case of falls or other needs. Wearable devices can be used for this purpose; however, most have proven obtrusive, and patients are reluctant to wear them or forget to do so. In this study, we used infrared technology to recognize human activities including sitting, standing, walking, lying in bed, lying down, and falling. We evaluated a system consisting of two 24×32 thermal array sensors capturing the same scene: one mounted on a side wall of an experimental room and the other on its ceiling. We chose the side and overhead mounts to compare classifier performance across the two viewpoints. Using our prototypes, we collected data from healthy young volunteers performing eight different scenarios. We then converted the sensor readings into images and applied a supervised deep learning approach. A visible-light camera also captured the scene, and its video served as the ground truth. The deep learning network consisted of a convolutional neural network that automatically extracted features from the infrared images. The overall average F1-score across all classes was 0.9044 for the side mount and 0.8893 for the overhead mount; the overall average accuracy was 96.65% for the side mount and 95.77% for the overhead mount. Our results suggest that our infrared-based method not only recognizes human activities unobtrusively but is also reasonably accurate.
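The abstract's first processing step, converting a raw 24×32 thermal array readout into an image suitable for a CNN, can be sketched as a simple min-max normalization to 8-bit grayscale. This is a minimal illustration, not the authors' published pipeline: the `frame_to_image` function, the synthetic temperature frame, and the normalization choice are all assumptions (the 24×32 resolution matches sensors such as the Melexis MLX90640, though the abstract does not name the sensor).

```python
import numpy as np

def frame_to_image(frame):
    """Min-max normalize one 24x32 thermal frame to an 8-bit grayscale image.

    Assumed preprocessing step, for illustration only; the paper does not
    specify how sensor readings were converted to images.
    """
    f = np.asarray(frame, dtype=np.float32)
    lo, hi = f.min(), f.max()
    scale = hi - lo
    if scale == 0:
        # Flat frame (no thermal contrast): return an all-black image.
        return np.zeros_like(f, dtype=np.uint8)
    return np.round((f - lo) / scale * 255.0).astype(np.uint8)

# Synthetic frame standing in for one 24x32 temperature readout in degrees C
# (hypothetical values; real indoor scenes with a person span a similar range).
rng = np.random.default_rng(0)
frame = rng.uniform(18.0, 36.0, size=(24, 32))
img = frame_to_image(frame)
print(img.shape, img.dtype, img.min(), img.max())
```

A stack of such frames, one channel per image, is the kind of input a small convolutional network can then classify into the six activity labels listed above.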
