Abstract
To accurately understand human-to-human interactions, human interaction recognition (HIR) systems require robust, vision-sensor-based feature extraction and selection methods. In this paper, we propose the WHITE STAG model to track human interactions using spatio-temporal features over full-body silhouettes and shape-based angular-geometric sequential features over skeleton joints, respectively. After feature extraction, the feature space is reduced by codebook generation and linear discriminant analysis (LDA). Finally, a kernel sliding perceptron is used to recognize multiple classes of human interactions. The proposed WHITE STAG model is validated on two publicly available RGB datasets and one newly self-annotated intensity interaction dataset. For evaluation, four experiments are performed using leave-one-out and cross-validation testing schemes. The WHITE STAG model with the kernel sliding perceptron outperforms well-known state-of-the-art statistical methods, achieving weighted average recognition rates of 87.48% on UT-Interaction, 87.5% on BIT-Interaction, and 85.7% on the proposed IM-IntensityInteractive7 dataset. The proposed system should be applicable to various multimedia and security applications, such as surveillance systems, video-based learning, medical futurists, service cobots, and interactive gaming.
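The recognition stage described above can be illustrated with a minimal, hypothetical sketch (not the authors' implementation): LDA reduces the extracted feature space, and a basic one-vs-rest kernel perceptron stands in for the paper's kernel sliding perceptron. The feature dimensionality, number of classes, RBF kernel, and update rule below are illustrative assumptions only.

```python
# Minimal sketch, assuming synthetic spatio-temporal/geometric feature vectors:
# LDA feature reduction followed by a simple one-vs-rest kernel perceptron
# (a stand-in for the paper's kernel sliding perceptron).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))    # placeholder extracted features
y = rng.integers(0, 7, size=200)  # 7 interaction classes (e.g., IM-IntensityInteractive7)

# Reduce the feature space with LDA (at most n_classes - 1 components).
X_lda = LinearDiscriminantAnalysis(n_components=6).fit_transform(X, y)

def rbf(a, b, gamma=0.5):
    """RBF kernel matrix between two sample sets."""
    d = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

# Dual-form multiclass perceptron: one weight vector of sample coefficients per class.
K = rbf(X_lda, X_lda)
alphas = np.zeros((7, len(y)))
for _ in range(10):                    # a few passes over the training samples
    for i in range(len(y)):
        pred = int(np.argmax(alphas @ K[:, i]))
        if pred != y[i]:               # mistake-driven update
            alphas[y[i], i] += 1.0
            alphas[pred, i] -= 1.0

train_pred = np.argmax(alphas @ K, axis=0)
print("training accuracy:", (train_pred == y).mean())
```

In the paper's pipeline, the placeholder features would be replaced by the codebook-encoded silhouette and skeleton-joint descriptors before LDA is applied.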