Abstract

The number of single-person households of elderly people is increasing in Japan. We are conducting a series of studies on the “Biofied Building,” which controls living spaces to suit each resident's needs. The system uses algorithms learned from living organisms with the help of small sensor-agent robots, which collect all the information in the living space needed for space control. Information about the residents' activities is among the most important data for the “Biofied Building.” When activity data are acquired, tags describing the meaning of each activity must be attached. For this purpose, we propose an activity recognition method for residents in living spaces. An activity is defined as an action aimed at some target and consists of several movements; a movement is a motion defined by a unit body form. In this study, we defined two types of activities: activities consisting of a single movement and activities consisting of multiple movements. For example, the activity of “eating,” which consists of multiple movements such as “holding chopsticks” and “picking up food,” is classified as a “Multiple type” activity, whereas “standing up,” which consists of the single movement of “standing up,” is classified as a “Single type” activity. Activities in the living space are thus divided into these two categories. We used Microsoft's Kinect for Windows to acquire depth data. After noise rejection, the R transformation or variance, PCA, LDA, and k-means classification were applied to recognize the activities. The proposed method has two phases: an activity categorization phase and an activity recognition phase.
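As a rough illustration of the feature extraction step, the sketch below implements the R transformation as it is commonly defined in the silhouette-based activity recognition literature: for each angle, the Radon projection of a binary silhouette is computed and its squared values are summed, yielding a one-dimensional feature over angles. The function names and the nearest-neighbour binning are our own illustration, not the authors' implementation.

```python
import numpy as np

def radon_projection(img, theta_deg):
    """Crude nearest-neighbour Radon projection of a 2-D binary
    silhouette at the given angle (counts foreground pixels per
    signed-distance bin)."""
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    t = np.deg2rad(theta_deg)
    ys, xs = np.nonzero(img)                      # foreground pixels only
    # signed distance of each pixel from the line through the centre
    rho = (xs - cx) * np.cos(t) + (ys - cy) * np.sin(t)
    diag = int(np.ceil(np.hypot(h, w)))
    bins = np.round(rho + diag / 2).astype(int)
    return np.bincount(bins, minlength=diag + 1).astype(float)

def r_transform(img, n_angles=180):
    """R(theta) = sum over rho of g(rho, theta)^2, normalised by its
    maximum, giving a 1-D translation-robust shape descriptor."""
    feats = np.array([np.sum(radon_projection(img, th) ** 2)
                      for th in range(n_angles)])
    return feats / feats.max()
```

In this formulation the squared-projection sum makes the descriptor insensitive to where the silhouette sits in the frame, which is why it is a popular choice for posture features.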
Eleven subjects performed ten activities: six “Single type” activities (“standing up”, “sitting down”, “crouching”, “standing up from crouching”, “picking up something”, and “standing up from picking up something”) and four “Multiple type” activities (“handling a smartphone”, “reading a book”, “writing”, and “eating”). We showed that categorizing the activities by type before recognition is effective, because each activity type has its own features. The mean recognition rate of activity categorization was 100%, and the mean recognition rates of activity classification were higher with categorization than without it. After the activity categorization phase, activity classification using the R transformation and variance was applied; both methods transform two-dimensional depth data into one-dimensional feature vectors. The R transformation is often used for image analysis in conventional methods, but it has rarely been applied to “Multiple type” activities. We applied both methods to “Multiple type” activities and found that they worked well. In addition, we examined the effect of variance as a transformation of the two-dimensional matrix: because “Multiple type” activities consist of continuous movements within a limited space, we hypothesized that the variance of several consecutive depth frames could capture their features. As a result, we obtained a mean recognition rate of 91.4% for “Single type” activities using the R transformation, and 77.3% for “Multiple type” activities using variance. Although further improvement is needed, the proposed method shows a certain level of feasibility for practical applications.
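The variance feature for “Multiple type” activities can be sketched as below: the per-pixel variance over a short window of consecutive depth frames is pooled into a coarse grid, so small repetitive movements (e.g. a hand manipulating chopsticks) appear as localized high-variance cells. The window length and the 8×8 grid pooling are our assumptions for illustration, not parameters taken from the paper.

```python
import numpy as np

def variance_feature(frames, grid=(8, 8)):
    """Pooled per-pixel temporal variance of consecutive depth frames.

    frames : (T, H, W) array of depth images from one sliding window.
    Returns a flat (grid[0] * grid[1],) feature vector: the temporal
    variance image averaged over coarse grid cells.
    """
    frames = np.asarray(frames, dtype=float)
    var_map = frames.var(axis=0)                  # (H, W) variance image
    gh, gw = grid
    H, W = var_map.shape
    ch, cw = H // gh, W // gw                     # cell size per grid bin
    cropped = var_map[:gh * ch, :gw * cw]         # drop remainder pixels
    pooled = cropped.reshape(gh, ch, gw, cw).mean(axis=(1, 3))
    return pooled.ravel()
```

The resulting fixed-length vector could then be fed to the PCA/LDA and k-means stages mentioned above, the same way the R-transform features are.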
