Abstract

Collaborative robotic configurations for monitoring and tracking human targets have attracted interest in the fourth industrial revolution. Fusing different types of sensors embedded in collaborative robotic systems yields high-quality information and significantly improves robotic perception. However, current methods have not deeply explored the capabilities of thermal multisensory configurations in human-oriented tasks. In this paper, we propose thermal multisensor fusion (TMF) for collaborative robots to overcome the limitations of stand-alone robots. Thermal vision exploits the heat signature of the human body for human-oriented tracking. An omnidirectional (O-D) infrared (IR) sensor provides a wide field of view (FOV) to detect human targets, and a Stereo IR sensor helps determine the distance of the human target in the oriented direction. Fusing O-D IR and Stereo IR also creates a multisensor stereo for an additional estimate of the distance to the target. The fusion of thermal and O-D sensors combines their advantages while compensating for their individually limited prediction accuracy. The maximum a posteriori (MAP) method predicts the distance of the target with high accuracy by weighting the TMF-stereo distance estimates from multiple platforms according to sensor reliability, rather than relying on visible-band-based tracking methods. The proposed method tracks the distance estimate of each sensor instead of the target trajectory, as visible-band methods do. We show that TMF increases the perception of robots by offering a wide FOV and provides precise target localization for collaborative robots.
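The MAP-based fusion described above can be sketched as a precision-weighted combination of per-sensor distance estimates, assuming independent Gaussian measurement noise whose variance encodes each sensor's reliability. The function name, the example readings, and the variances below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def map_fuse(distances, variances):
    """MAP estimate of target distance from multiple sensors.

    Under independent Gaussian noise and a flat prior, the MAP
    estimate reduces to a precision-weighted average: more reliable
    sensors (smaller variance) contribute more to the result.
    """
    d = np.asarray(distances, dtype=float)
    w = 1.0 / np.asarray(variances, dtype=float)  # precision = reliability
    return float(np.sum(w * d) / np.sum(w))

# Hypothetical per-platform TMF-stereo readings (meters) and variances:
# Stereo IR, O-D IR, and the cross-platform multisensor stereo.
fused = map_fuse([3.2, 3.5, 3.0], [0.04, 0.25, 0.09])
```

The fused value is pulled toward the low-variance (most reliable) sensor, which is the behavior the abstract attributes to reliability-weighted MAP fusion.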
