Abstract
Detecting human parts at the instance level is an essential prerequisite for the analysis of human keypoints, actions, and attributes. Nonetheless, there is a lack of a large-scale, richly annotated dataset for human parts detection. We fill this gap by proposing COCO Human Parts. Built on COCO 2017, it is the first instance-level human parts dataset and contains images of complex scenes with high diversity. To reflect the diversity of the human body in natural scenes, we annotate human parts with (a) location in terms of a bounding box, (b) category, including face, head, hand, and foot, (c) the subordinate relationship between a person and his or her parts, and (d) fine-grained classification into right/left hand and right/left foot. Many higher-level applications and studies can be built upon COCO Human Parts, such as gesture recognition, face/hand keypoint detection, visual actions, human-object interactions, and virtual reality. In total, the dataset contains 268,030 person instances across 66,808 images, with an average of 2.83 annotated parts per person instance. We provide a statistical analysis of the accuracy of our annotations. In addition, we propose a strong baseline for detecting human parts at the instance level over this dataset in an end-to-end manner, called Hier(archy) R-CNN. It is a simple but effective extension of Mask R-CNN that detects the human parts of each person instance and predicts the subordinate relationship between them. Code and dataset are publicly available (https://github.com/soeaver/Hier-R-CNN).
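To make the annotation scheme (a)–(d) concrete, the following is a minimal sketch of what a single person record could look like in a COCO-style JSON layout. The field names used here ("parts", "parent_id", "category", "bbox") are illustrative assumptions, not the dataset's actual schema; consult the released annotations for the real format.

```python
# Hypothetical, COCO-style person annotation for COCO Human Parts.
# All field names are assumptions made for illustration only.
person_annotation = {
    "id": 101,                                  # person instance id
    "image_id": 42,                             # image the instance belongs to
    "bbox": [120.0, 35.0, 180.0, 420.0],        # (a) location as [x, y, width, height]
    "parts": [
        {
            "category": "head",                 # (b) part type: face / head / hand / foot
            "bbox": [150.0, 40.0, 60.0, 70.0],
            "parent_id": 101,                   # (c) subordinate relationship to the person
        },
        {
            "category": "right_hand",           # (d) fine-grained left/right label
            "bbox": [110.0, 250.0, 30.0, 35.0],
            "parent_id": 101,
        },
    ],
}

# Counting parts per person in such records would yield statistics like the
# reported average of 2.83 parts per person instance.
num_parts = len(person_annotation["parts"])
print(f"parts for this person instance: {num_parts}")
```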