Abstract
Smart support systems that recognize Activities of Daily Living (ADLs) can help elderly people live independently for longer, improving their quality of life. Many machine learning approaches have recently been proposed for Human Activity Recognition (HAR), including elaborate networks that combine convolutional, recurrent, and attention layers. The ubiquity of wearable devices provides a growing amount of time-series data that can be used for such applications in an unobtrusive manner. However, few studies have examined the performance of the attention-based Transformer model in HAR, especially for complex activities such as ADLs. This work implements and evaluates a self-attention Transformer model for the classification of ADLs and compares it to the well-established approach of recurrent Long Short-Term Memory (LSTM) networks. The proposed method is a two-level hierarchical model: atomic activities are recognized in the first step, and their probability scores are extracted and used by a Transformer in the second step to classify seven more complex ADLs. Our results show that the Transformer model matches and, in the subject-dependent configuration, clearly outperforms LSTM networks (73.36 % vs. 69.09 %), while relying solely on the attention mechanism to capture global dependencies between input and output, without any recurrence. The proposed model was tested with two different segment lengths, demonstrating its effectiveness in learning long-range dependencies of shorter actions within complex activities.
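To make the two-level idea concrete, the following is a minimal, hypothetical sketch of the data flow, not the authors' implementation: stage one yields per-time-step atomic-activity probability scores (stubbed here with random values of the right shape), and stage two applies a single self-attention step so every time step attends to the whole segment without recurrence. The segment length `T`, the number of atomic activities `n_atomic`, and the identity Q/K/V projections are simplifying assumptions for illustration; a real second stage would use learned projections, multiple heads, and a trained 7-way ADL classifier head.

```python
import math
import random

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(seq):
    """Single-head self-attention with identity Q/K/V projections (a
    simplification): each time step is re-expressed as an attention-weighted
    average over all time steps, capturing global dependencies without
    recurrence."""
    d = len(seq[0])
    scale = math.sqrt(d)
    out = []
    for q in seq:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / scale for k in seq]
        weights = softmax(scores)  # weights over all T time steps, sum to 1
        out.append([sum(w * v[j] for w, v in zip(weights, seq))
                    for j in range(d)])
    return out

# Stage 1 stub: in the proposed method these scores come from a trained
# atomic-activity classifier; random values are used here only to show the
# shape (T time steps x n_atomic probability scores).
random.seed(0)
T, n_atomic = 6, 5  # hypothetical segment length and atomic-activity count
stage1_scores = [softmax([random.random() for _ in range(n_atomic)])
                 for _ in range(T)]

# Stage 2: attention mixes information across the whole segment; a pooled
# representation would then feed a 7-way ADL classifier head (omitted).
attended = self_attention(stage1_scores)
pooled = [sum(step[j] for step in attended) / T for j in range(n_atomic)]
```

Because each attended row is a convex combination of probability vectors, it remains a valid probability vector, which keeps the interface between the two stages easy to reason about.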