Abstract

Safe collaboration between human and robotic agents remains an important challenge in manufacturing set-ups. Existing strategies, including safety zoning, are sub-optimal: they seldom fully exploit either the collaborative robot's suitability for repetitive tasks or the operator's suitability for highly cognitive tasks. The recently released ISO/TS 15066 specification for collaborative robots proposes varying safeguards, including force-, speed- and distance-limiting functions. The latter is particularly attractive, as it allows the robotic agent to adapt its operating behaviour in proximity to the operator and in situations likely to lead to safety hazards. This paper discusses strategies explored for implementing dynamic zoning in shared workspaces, in which the robot's speed/force is made dependent on the distance between the human and the robot. Two main strategies were modelled for implementing zoning. The first integrates a LiDAR sensor and uses the LiDAR data to dynamically map the separation distance between the operator and the robotic agent. The second uses an experimental setup based on the Microsoft Kinect V2 sensor to capture 3D point clouds and, in turn, detect objects/agents and their proximity. In both cases, objects/agents were detected up to a separation-distance threshold, with error sensitivity below 0.1 metres. Both use cases were demonstrated using a YuMi robot and form the basis of future work towards dynamic workspace zoning.
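The distance-dependent speed limiting described above can be sketched minimally as follows. This is an illustrative example only, not the paper's implementation: the linear scaling, the 0.1 m stop distance (chosen here to echo the sensitivity threshold mentioned in the abstract), and the other thresholds are assumptions.

```python
def speed_limit(separation_m: float,
                stop_dist: float = 0.1,      # assumed: halt below this separation (m)
                full_speed_dist: float = 1.0,  # assumed: unrestricted above this (m)
                v_max: float = 1.5) -> float:  # assumed maximum tool speed (m/s)
    """Return an allowed robot speed for a given human-robot separation.

    Below stop_dist the robot halts; above full_speed_dist it runs at
    v_max; in between, the allowed speed scales linearly with distance.
    """
    if separation_m <= stop_dist:
        return 0.0
    if separation_m >= full_speed_dist:
        return v_max
    return v_max * (separation_m - stop_dist) / (full_speed_dist - stop_dist)


# Example: as the sensed separation shrinks, the commanded speed ramps down.
for d in (2.0, 0.55, 0.05):
    print(f"separation {d:.2f} m -> speed limit {speed_limit(d):.2f} m/s")
```

In a deployed system, `separation_m` would be fed continuously from the LiDAR or Kinect V2 pipeline, and the thresholds would be derived from a protective-separation-distance calculation rather than fixed constants.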
