Abstract
With recent advances in sensing, it has become possible to build better assistive technologies, strengthening eldercare with regard to daily routines and the provision of personalised care. For instance, a person's behaviour can be detected using wearable or ambient sensors; however, it is difficult for users to wear devices 24/7, as the devices' energy consumption requires regular recharging. Similarly, although cameras are widely used as ambient sensors, they risk breaching users' privacy. This paper presents a novel deep-learning-based sensing approach for human activity recognition using a non-wearable ultra-wideband (UWB) radar sensor. UWB sensors protect privacy better than RGB cameras because they do not collect visual data. In this study, UWB sensors were mounted on a mobile robot to monitor and observe subjects from a distance of 1.5–2.0 m. Data were first collected in a lab environment for five different human activities. These data were then used to train a model based on a state-of-the-art deep learning approach, long short-term memory (LSTM); conventional training approaches were also tested to validate the superiority of LSTM. Because a UWB sensor collects many data points in a single frame, enhanced discriminant analysis was used to reduce the feature dimensions: principal component analysis was applied to the raw dataset, followed by linear discriminant analysis. The enhanced discriminant features were fed into the LSTM, and the trained model was tested on new inputs. The proposed LSTM-based activity recognition approach outperformed conventional approaches, achieving an accuracy of 99.6% under 5-fold cross-validation; it was also validated on a publicly available dataset. The proposed method can be applied in many prominent fields, including human–robot interaction, for practical applications such as mobile robots for eldercare.
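As a rough illustration of the pipeline the abstract describes (PCA, then LDA as "enhanced discriminant analysis", then an LSTM classifier), the sketch below uses synthetic stand-in data. The frame size, component counts, sequence length, and network hyper-parameters are assumptions for illustration, not values from the paper.

```python
# Minimal sketch of the pipeline described in the abstract: PCA followed by
# LDA on per-frame UWB features, then an LSTM classifier over sequences of
# the discriminant features. All sizes below are illustrative assumptions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from tensorflow import keras

n_frames, n_bins, n_classes, seq_len = 5000, 400, 5, 30  # assumed sizes
X = np.random.randn(n_frames, n_bins)          # stand-in for raw UWB frames
y = np.random.randint(0, n_classes, n_frames)  # stand-in activity labels

# Step 1: PCA on the raw frames, then LDA on the PCA scores.
# LDA yields at most n_classes - 1 = 4 discriminant features.
pca = PCA(n_components=50).fit(X)
lda = LinearDiscriminantAnalysis(n_components=n_classes - 1)
lda.fit(pca.transform(X), y)
feats = lda.transform(pca.transform(X))        # shape: (n_frames, 4)

# Step 2: group consecutive frames into fixed-length windows for the LSTM.
n_seq = n_frames // seq_len
X_seq = feats[: n_seq * seq_len].reshape(n_seq, seq_len, -1)
y_seq = y[: n_seq * seq_len].reshape(n_seq, seq_len)[:, 0]  # one label/window

# Step 3: a small LSTM classifier over the discriminant-feature sequences.
model = keras.Sequential([
    keras.layers.Input(shape=(seq_len, n_classes - 1)),
    keras.layers.LSTM(64),
    keras.layers.Dense(n_classes, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X_seq, y_seq, epochs=5, batch_size=32, verbose=0)
```

The paper's 5-fold cross-validation would correspond to repeating the fit above over `sklearn.model_selection.KFold` splits of the windowed data, fitting the PCA and LDA transforms on each training fold only.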
Highlights
According to a 2017 report by the United Nations Department of Economic and Social Affairs, the population of older adults is increasing more rapidly than that of other age groups [1].
Most previous human activity recognition (HAR) studies have relied on hand-crafted features, which are sometimes difficult to distinguish with sufficient accuracy to classify activities [17].
We describe the experiments performed on the XeThru UWB sensor dataset to recognise various human activities.
Summary
Independent living among older adults has become a significant challenge from both social and economic perspectives. A non-contact ambient sensor free of the privacy issues associated with cameras is the XeThru ultra-wideband (UWB) radar [7]–[9]. Sensors such as UWB radars can be used for general robot navigation and for emergency analysis based on human body movement. During the long COVID-19 pandemic, additional functions, such as disinfection operations and remote detection of elevated body temperature, were performed by the care robot Lio. Human activity recognition (HAR) and emergency detection have made significant progress in recent years through machine learning techniques [16]. The CNN-based multi-channel time-series architecture is task-dependent and is characterised by higher discrimination accuracy for classifying human activities; a minimal sketch of such an architecture is given below. Sharma et al. [29] introduced a channel impulse response (CIR)-based HAR system that can recognise sitting, standing, and lying positions.
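The sketch below shows what a multi-channel time-series CNN of the kind contrasted with the paper's LSTM might look like. The channel count, window length, and layer sizes are illustrative assumptions, not taken from the cited work.

```python
# Hedged sketch of a multi-channel time-series CNN for HAR: stacked 1D
# convolutions over a fixed window of multi-channel sensor readings.
# All sizes below are illustrative assumptions only.
from tensorflow import keras

n_channels, window, n_classes = 3, 128, 5  # assumed channels / window / classes
model = keras.Sequential([
    keras.layers.Input(shape=(window, n_channels)),
    keras.layers.Conv1D(32, kernel_size=5, activation="relu"),
    keras.layers.MaxPooling1D(2),
    keras.layers.Conv1D(64, kernel_size=5, activation="relu"),
    keras.layers.GlobalAveragePooling1D(),
    keras.layers.Dense(n_classes, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```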