Abstract

Assessment of human behavior during the performance of daily routine actions in indoor environments plays a significant role in healthcare services and smart homes for elderly and disabled people. In the proposed system, depth images are first captured with a depth camera, and human silhouettes are segmented based on color and intensity variation. Spatiotemporal features are then extracted from the human body color joints and the depth silhouettes: joint displacement and specific-motion features are obtained from the body color joints, while side-frame differentiation features are computed from the depth data to improve classification performance. Finally, a recognizer engine classifies the different activities. Unlike conventional approaches that are evaluated on a single dataset, the proposed system achieves state-of-the-art accuracies of 88.9% and 66.70% on two challenging depth datasets. The proposed system should be serviceable, with major contributions to consumer application systems such as smart homes, video surveillance, and health monitoring systems.
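
To make the feature types named above concrete, the following is a minimal sketch of per-joint displacement and depth frame-differencing features. The array shapes, function names, and threshold are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def joint_displacement_features(joints):
    """Displacement of each tracked joint between consecutive frames.

    `joints` is assumed to have shape (T, J, 3): T frames, J body
    joints, 3-D coordinates (an assumption for illustration).
    """
    # Displacement vectors between consecutive joint positions.
    disp = np.diff(joints, axis=0)              # (T-1, J, 3)
    # Euclidean magnitude of each joint's motion per frame.
    return np.linalg.norm(disp, axis=-1)        # (T-1, J)

def frame_differentiation_features(depth_frames, threshold=10):
    """Simple frame differencing over a depth silhouette sequence.

    `depth_frames` is assumed to have shape (T, H, W), holding
    segmented depth silhouettes (again, an assumed layout).
    """
    # Absolute depth change between consecutive frames.
    diff = np.abs(np.diff(depth_frames.astype(np.int32), axis=0))
    # Fraction of pixels whose depth changed beyond the threshold,
    # giving one motion-energy value per frame transition.
    return (diff > threshold).mean(axis=(1, 2))  # (T-1,)
```

Feature vectors of this kind would then be concatenated per sequence and passed to the recognizer engine for activity classification.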
