Abstract

Modern service robots are equipped with one or more sensors, often including RGB-D cameras, to perceive objects and humans in the environment. This paper proposes a new system for the recognition of human social activities from a continuous stream of RGB-D data. Most previous work has succeeded in recognising activities from pre-clipped videos in datasets, but robotic applications require a move to more realistic scenarios in which such activities are not manually selected. It is therefore useful to detect the time intervals during which humans are performing social activities, whose recognition can help trigger human-robot interactions or detect situations of potential danger. The main contributions of this research work include a novel system for the recognition of social activities from continuous RGB-D data, combining temporal segmentation and classification, as well as a model for learning proximity-based priors of the social activities. A new public dataset with RGB-D videos of social and individual activities is also provided and used to evaluate the proposed solutions. The results show the good performance of the system in recognising social activities from continuous RGB-D data.

Highlights

  • In many applications of service and domestic robots, for example to help customers in a shopping centre or assist elderly people at home, it is important to be able to identify and recognise human activities

  • A Dynamic Bayesian Mixture Model (DBMM) is a probabilistic ensemble of classifiers that uses a Dynamic Bayesian Network (DBN) and a mixture model to fuse the outputs of different classifiers, exploiting temporal information from previous time slices (a minimal fusion sketch is given after these highlights)

  • The Bayesian Information Criterion (BIC) penalises models with a higher number of parameters more strongly than the Akaike Information Criterion (AIC), making it more suitable for avoiding overfitting
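For reference, the standard textbook definitions of the two criteria (not specific to this paper) are, with k the number of free parameters, n the number of training samples and L̂ the maximised likelihood:

    \mathrm{AIC} = 2k - 2\ln\hat{L}, \qquad \mathrm{BIC} = k\ln(n) - 2\ln\hat{L}

Since ln(n) > 2 whenever n > e² ≈ 7.4, the BIC penalty on extra parameters exceeds the AIC penalty for all but very small sample sizes, which is why BIC tends to favour more parsimonious models.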
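The following is a minimal, illustrative sketch of DBMM-style fusion, written to clarify the highlight above rather than to reproduce the authors' implementation; the function name, the choice of two base classifiers and three activity classes, and the use of the previous posterior as the temporal prior are assumptions made for the example.

    import numpy as np

    def dbmm_fusion(classifier_probs, weights, prev_posterior):
        # classifier_probs: (n_classifiers, n_classes) per-class probabilities from each base classifier
        # weights: (n_classifiers,) mixture weights for the base classifiers (sum to 1)
        # prev_posterior: (n_classes,) posterior from the previous time slice, acting as the dynamic prior
        mixture = weights @ classifier_probs      # mixture-model fusion of the base classifiers
        unnorm = prev_posterior * mixture         # dynamic Bayesian update: prior x fused likelihood
        return unnorm / unnorm.sum()              # normalised posterior for the current time slice

    # Toy usage: two base classifiers, three activity classes, uniform initial prior
    probs = np.array([[0.7, 0.2, 0.1],
                      [0.5, 0.3, 0.2]])
    weights = np.array([0.6, 0.4])
    posterior = dbmm_fusion(probs, weights, np.full(3, 1.0 / 3.0))

In an actual DBMM the weights are typically learned from each base classifier's performance on training data, and the posterior is propagated over consecutive time slices.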

Summary

Introduction

In many applications of service and domestic robots, for example to help customers in a shopping centre or assist elderly people at home, it is important to be able to identify and recognise human activities. Particular attention has been given to indoor activities for potential applications in security, retail and Active & Assisted Living (AAL) scenarios. In the latter case, for example, human activity recognition with a domestic robot can be useful to identify potential problems and apply corrective strategies. Since the performance of these two models (interaction segmentation and social activity classification) has so far been evaluated only individually, their combined performance needs to be assessed before they can be used in robotic applications. Compared to those works, the new contributions of this paper are fourfold:

1. a novel framework and full pipeline implementation for recognising social activities in realistic scenarios from continuous RGB-D data;
2. an improved method to learn proximity-based priors, based on Gaussian Mixture Models, which are used in the probabilistic classification of social activities (an illustrative sketch is given below);
3. a new public dataset with continuous RGB-D sequences of individual and fully labelled social activities for the evaluation and future comparison of our method;
4. an extensive experimental analysis, including a comparative study of our social activity classification.

Fig. 1 Overview of the social activity recognition system, segmenting and classifying interactions from continuous RGB-D skeleton data.

Sect. 4 of the paper introduces the features designed for the detection of interactions and the recognition of social activities from RGB-D data.
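To make the proximity-based prior idea concrete, the following is a minimal sketch, assuming that a Gaussian Mixture Model is fitted per social activity over the inter-person distance observed in training data and that the resulting likelihoods are normalised into a prior; the activity names, distance values and two-component mixtures are illustrative assumptions, not the paper's actual configuration.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    def fit_proximity_models(train_distances, n_components=2):
        # train_distances: dict mapping activity name -> 1-D array of inter-person distances (metres)
        return {activity: GaussianMixture(n_components=n_components, random_state=0)
                          .fit(np.asarray(d).reshape(-1, 1))
                for activity, d in train_distances.items()}

    def proximity_prior(models, distance):
        # Return a normalised prior over activities for one observed inter-person distance
        activities = list(models)
        likelihoods = np.array([np.exp(models[a].score_samples([[distance]]))[0] for a in activities])
        return dict(zip(activities, likelihoods / likelihoods.sum()))

    # Toy usage with synthetic training distances for two hypothetical social activities
    rng = np.random.default_rng(0)
    train = {"handshake": rng.normal(0.8, 0.1, 300),
             "conversation": rng.normal(1.6, 0.3, 300)}
    models = fit_proximity_models(train)
    prior = proximity_prior(models, 0.9)   # close range: the "handshake" prior should dominate

Such a prior can then be combined with the output of the activity classifier (e.g. a DBMM as sketched above), so that activities whose typical interpersonal distance matches the observed one receive a higher posterior probability.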

Classification of Human Activities
Activity Recognition Datasets
Detection of Social Interactions
Feature-Sets
Segmentation Features
Classification Features
Interaction Segmentation
Social Activity Classification
Dynamic Bayesian Mixture Model
Proximity-Based Priors
Combined Model
Social Activity Dataset
Overall System Performance
Analysis of Interaction Segmentation
Analysis of Social Activity Classification
Analysis of Proximity-Based Priors
Comparative Study
Conclusion
Findings
Compliance with ethical standards
