Abstract

At its core, human behavior recognition is a problem of locating the key points of the human body: key points from different body parts are connected into a limb skeleton, and the behavior is then inferred from how that skeleton changes over time. However, key-point localization faces several difficulties: the complexity of human actions produces a wide variety of joint positions, and small joints may be hard to see, blurred, or occluded. Current mainstream algorithms infer key-point locations mainly from context. This paper addresses the difficulty of detecting poses with blurred or distant, small key points by combining the OpenPose network with the YOLO network. First, the CSPDarknet53 feature extraction network is modified so that two different feature extraction strategies can be applied depending on distance. For nearby human bodies, pose is detected with OpenPose's Part Confidence Maps (PCM) and Part Affinity Fields (PAFs). For distant human bodies that cannot be detected this way, PANet feature enhancement is applied and non-maximum suppression (NMS) is then used to filter the candidate boxes. Experimental comparison shows that the trained algorithm achieves more than 95% accuracy while maintaining good real-time performance.
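
The candidate-box filtering step mentioned above is standard IoU-based non-maximum suppression. The following is a minimal sketch of that step only, not the authors' code; the box format ([x1, y1, x2, y2] corner coordinates), the per-box confidence scores, and the IoU threshold of 0.5 are assumptions for illustration.

    import numpy as np

    def nms(boxes, scores, iou_threshold=0.5):
        """Return indices of boxes kept after non-maximum suppression."""
        x1, y1, x2, y2 = boxes[:, 0], boxes[:, 1], boxes[:, 2], boxes[:, 3]
        areas = (x2 - x1) * (y2 - y1)
        order = scores.argsort()[::-1]  # process boxes from highest to lowest score
        keep = []
        while order.size > 0:
            i = order[0]
            keep.append(i)
            # Intersection of the highest-scoring box with the remaining boxes
            xx1 = np.maximum(x1[i], x1[order[1:]])
            yy1 = np.maximum(y1[i], y1[order[1:]])
            xx2 = np.minimum(x2[i], x2[order[1:]])
            yy2 = np.minimum(y2[i], y2[order[1:]])
            inter = np.maximum(0.0, xx2 - xx1) * np.maximum(0.0, yy2 - yy1)
            iou = inter / (areas[i] + areas[order[1:]] - inter)
            # Keep only boxes whose overlap with the kept box is below the threshold
            order = order[1:][iou <= iou_threshold]
        return keep

    # Example: two overlapping detections of the same person and one distant person
    boxes = np.array([[10, 10, 110, 210], [12, 14, 108, 205], [300, 40, 360, 160]], dtype=float)
    scores = np.array([0.92, 0.85, 0.71])
    print(nms(boxes, scores))  # -> [0, 2]

In this sketch the duplicate detection of the near person (index 1) is suppressed, while the distant, lower-confidence detection (index 2) survives because it does not overlap the kept box.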
