Abstract

Purpose
This study aims to develop a real-time algorithm that can detect people even in arbitrary poses. To cope with poor and changing lighting conditions, it does not rely on color information. The developed method is expected to run on computers with low computational resources so that it can be deployed on autonomous mobile robots.

Design/methodology/approach
The method is designed as a people-detection pipeline composed of a series of operations. Efficient point cloud processing steps, together with a novel head extraction operation, provide candidate head clusters in the scene. Classifying these clusters with support vector machines yields a fast and robust people detector.

Findings
The method is implemented on an autonomous mobile robot, and results show that it can detect people at a frame rate of 28 Hz with an equal error rate of 92 per cent. In various non-standard poses, the detector is still able to classify people effectively.

Research limitations/implications
The main limitations are point clouds resembling the shape of a head, which cause false positives, and disruptive accessories (such as large hats), which cause false negatives. Still, these can be overcome with sufficient training samples.

Practical implications
The method can be used in industrial and social mobile applications because of its robustness, low resource needs and low power consumption.

Originality/value
The paper introduces a novel and efficient technique to detect people in arbitrary poses, under poor lighting conditions and with low computational resources. Solving all these problems in a single, lightweight method makes the study fulfill an important need for collaborative and autonomous mobile robots.
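The final classification stage described above can be sketched as follows. This is an illustrative example only, not the authors' implementation: the geometric features, their values, and the training data are hypothetical placeholders standing in for whatever descriptors the actual pipeline extracts from each candidate head cluster.

```python
# Sketch: classifying candidate head clusters with a support vector
# machine, as in the final stage of the people-detection pipeline.
# Feature choices and training values below are hypothetical.
import numpy as np
from sklearn.svm import SVC

# Hypothetical geometric features per candidate cluster:
# [height (m), width (m), depth (m), normalized point count]
X_train = np.array([
    [0.24, 0.18, 0.20, 0.80],   # head-like cluster
    [0.22, 0.16, 0.19, 0.70],   # head-like cluster
    [0.50, 0.45, 0.40, 0.90],   # non-head cluster (e.g. a box)
    [0.10, 0.60, 0.55, 0.30],   # non-head cluster (e.g. a shelf edge)
])
y_train = np.array([1, 1, 0, 0])  # 1 = head, 0 = not a head

# RBF-kernel SVM; no color features are used, matching the
# paper's stated independence from lighting conditions.
clf = SVC(kernel="rbf", gamma="scale")
clf.fit(X_train, y_train)

# Classify a new candidate cluster extracted from the scene.
candidate = np.array([[0.23, 0.17, 0.21, 0.75]])
label = clf.predict(candidate)[0]
print("head" if label == 1 else "not a head")
```

In a real deployment, the per-cluster features would come from the point cloud processing and head extraction steps, and only clusters classified as heads would be reported as people.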
