Abstract

Purpose
To develop an effective human action recognition (HAR) system for unmanned environments.

Design/methodology/approach
This paper describes the key technologies of an efficient HAR system. Advancements in three key steps of the HAR system (feature extraction, feature description and action classification) are presented, implemented and analyzed to improve on the accuracy of existing HAR systems. The use of the implemented HAR system in a self-driving car is summarized. Finally, the results of the proposed HAR system are compared with those of other existing action recognition systems.

Findings
This paper presents the proposed modifications and improvements to the HAR system, namely a skeleton-based spatiotemporal interest point (STIP) feature, an improved discriminative sparse descriptor for the identified feature, and linear action classification.

Research limitations/implications
The experiments are carried out on captured benchmark data sets and still need to be evaluated in a real-time environment.

Practical implications
The middleware support between the proposed HAR system and the self-driving car system opens up several other challenging research opportunities.

Social implications
The authors' work provides a way to take a step forward in machine vision, especially for self-driving cars.

Originality/value
A method for extracting the new feature and constructing an improved discriminative sparse feature descriptor is introduced.
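The three-step pipeline summarized above (feature extraction, sparse feature description, linear action classification) can be sketched in miniature. Every function below is an illustrative placeholder for the corresponding stage, not the authors' actual algorithms: the temporal-difference "STIP" extractor, the thresholding-based "sparse" encoder, and the toy class weights are all assumptions made for the sake of a runnable example.

```python
def extract_stip_features(frames):
    """Placeholder feature extraction: reduce consecutive frames (lists of
    pixel intensities) to temporal-difference features, standing in for
    skeleton-based spatiotemporal interest points."""
    feats = []
    for prev, curr in zip(frames, frames[1:]):
        feats.append([abs(a - b) for a, b in zip(prev, curr)])
    return feats

def sparse_encode(feature, threshold=0.5):
    """Placeholder sparse descriptor: keep only responses above a threshold
    and zero the rest, a crude stand-in for discriminative sparse coding."""
    return [x if x > threshold else 0.0 for x in feature]

def linear_classify(descriptor, weights):
    """Linear action classification: score each action class with a dot
    product and return the index of the best-scoring class."""
    scores = [sum(w * x for w, x in zip(row, descriptor)) for row in weights]
    return max(range(len(scores)), key=scores.__getitem__)

# Toy usage: two 4-pixel frames and two hypothetical action classes.
frames = [[0.0, 0.2, 0.9, 0.1], [0.8, 0.2, 0.1, 0.1]]
feats = extract_stip_features(frames)   # one temporal-difference feature
desc = sparse_encode(feats[0])          # small motions are zeroed out
weights = [[1.0, 0.0, 0.0, 0.0],        # class 0 responds to pixel 0
           [0.0, 0.0, 2.0, 0.0]]        # class 1 responds to pixel 2
print(linear_classify(desc, weights))   # prints 1
```

The point of the sketch is the separation of concerns: each stage consumes the previous stage's output, so any one stage (e.g. the sparse encoder) can be improved independently, which mirrors the step-by-step improvements the paper reports.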
