Abstract

Our aim is to build a daily activity surveillance system for elderly people. In this study, we develop a Deep Neural Network (DNN)-based approach to human activity recognition using multi-modal (acoustic and acceleration) signals. A recent study demonstrated the effectiveness of a Feed-Forward Neural Network (FF-NN) for the daily activity recognition (DAR) task. However, the length of the temporal context that could be considered was limited, even though an actual daily activity event may span several continuous seconds or minutes. Moreover, from the perspective of practical use, an adaptation method is needed to obtain satisfactory recognition performance for multiple users even when only a small amount of training data is available for each user. In this study, we evaluate the effectiveness of applying a Recurrent Neural Network based on Long Short-Term Memory (LSTM-RNN) to DAR. We also evaluate the effectiveness of applying an LSTM-RNN with a projection layer (LSTMP-RNN) for subject adaptation: (1) by applying an RNN instead of an FF-NN, a much longer temporal context can be considered for daily activity events spanning several seconds or minutes, and (2) by introducing the LSTMP-RNN, an adaptation method can be realized that mitigates the overfitting problem while maintaining recognition performance. The results of the DAR experiments demonstrate that (1) applying an LSTM-RNN is effective compared to an FF-NN, and (2) applying an LSTMP-RNN is more effective than an LSTM-RNN when only a limited amount of data is available.
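To illustrate the architectural idea behind the LSTMP-RNN mentioned above, the sketch below implements a minimal LSTM cell with a recurrent projection layer in plain numpy. The projection maps the hidden state down to a smaller recurrent state, reducing the number of recurrent parameters, which is the property that helps when only a small amount of per-user adaptation data is available. This is a hypothetical illustration of the general LSTMP technique, not the authors' exact model; all dimensions and names are assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class LSTMPCell:
    """Minimal LSTM cell with a recurrent projection layer (LSTMP).

    Hypothetical sketch: the projection matrix P maps the hidden state
    (size n_hidden) down to a smaller recurrent state (size n_proj),
    shrinking the recurrent weight matrices and thus the number of
    parameters that must be estimated from limited adaptation data.
    """

    def __init__(self, n_in, n_hidden, n_proj, seed=0):
        rng = np.random.default_rng(seed)
        k = n_in + n_proj                      # gates see [x_t, r_{t-1}]
        # One stacked weight matrix for the input, forget, cell, output gates.
        self.W = rng.standard_normal((4 * n_hidden, k)) * 0.1
        self.b = np.zeros(4 * n_hidden)
        self.P = rng.standard_normal((n_proj, n_hidden)) * 0.1  # projection
        self.n_hidden = n_hidden
        self.n_proj = n_proj

    def step(self, x, r_prev, c_prev):
        z = self.W @ np.concatenate([x, r_prev]) + self.b
        i, f, g, o = np.split(z, 4)            # gate pre-activations
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c_prev + i * np.tanh(g)        # cell state update
        h = o * np.tanh(c)                     # full hidden state
        r = self.P @ h                         # projected recurrent state
        return r, c

    def run(self, xs):
        """Run over a sequence of feature vectors, returning projected states."""
        r = np.zeros(self.n_proj)
        c = np.zeros(self.n_hidden)
        outs = []
        for x in xs:
            r, c = self.step(x, r, c)
            outs.append(r)
        return np.stack(outs)
```

For example, with 8-dimensional multi-modal features, a hidden size of 32, and a projection size of 16, the recurrent input to the gates has dimension 24 instead of 40, so the gate weights shrink accordingly while the cell state keeps its full capacity.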

