Abstract
Human interaction recognition has received considerable attention, but effectively extracting interaction features and modeling the complex relationships between multiple targets remains a difficult problem. To address this problem, we propose a human interaction recognition method based on local spatio-temporal and global features. The method treats the interacting people as a single entity. First, the foreground human region is extracted by inter-frame differencing. Second, optical flow and space-time interest points are extracted. Because these features are large and high-dimensional, each frame is described by a histogram of oriented gradients (HOG) and a histogram of optical flow (HOF). A back-propagation (BP) neural network is then used to classify each feature of each frame. Because individual features have limitations for interaction recognition, the BP neural network outputs are combined through classifier fusion to obtain the final interaction category. The algorithm was tested on the UT-Interaction dataset, and the experimental results show that the method can recognize hitting, hugging, kicking, pushing, and handshaking interactions. The proposed method can be applied to intelligent video surveillance.
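The sketch below is a minimal illustration (not the authors' implementation) of the pipeline summarized above, assuming OpenCV and scikit-learn are available. The histogram bin count, thresholds, network sizes, and the probability-averaging fusion rule are illustrative assumptions rather than values taken from the paper.

```python
import cv2
import numpy as np
from sklearn.neural_network import MLPClassifier  # stands in for the BP neural network

N_BINS = 9  # assumed number of orientation bins for the HOG/HOF histograms


def foreground_mask(prev_gray, gray, thresh=25):
    """Foreground extraction by inter-frame differencing."""
    diff = cv2.absdiff(prev_gray, gray)
    _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    return mask


def hof_descriptor(prev_gray, gray, mask):
    """Histogram of optical-flow orientations inside the foreground region."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    sel = mask > 0
    hist, _ = np.histogram(ang[sel], bins=N_BINS, range=(0, 2 * np.pi),
                           weights=mag[sel])
    return hist / (hist.sum() + 1e-6)


def hog_descriptor(gray):
    """Gradient-orientation histogram (HOG-style) over the frame."""
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0)
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1)
    mag, ang = cv2.cartToPolar(gx, gy)
    hist, _ = np.histogram(ang, bins=N_BINS, range=(0, 2 * np.pi), weights=mag)
    return hist / (hist.sum() + 1e-6)


# One classifier per feature type; each is trained (fit) on per-frame
# descriptors beforehand (training code omitted here).
hog_net = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500)
hof_net = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500)


def predict_interaction(hog_feats, hof_feats):
    """Fuse per-frame HOG and HOF predictions by averaging class probabilities."""
    p = 0.5 * (hog_net.predict_proba(hog_feats).mean(axis=0)
               + hof_net.predict_proba(hof_feats).mean(axis=0))
    return int(np.argmax(p))  # index of the predicted interaction class
```

Averaging class probabilities is only one possible fusion rule; the paper's specific fusion strategy may differ.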