Abstract

This article investigates the problem of controlling the speed of robots in collaborative workcells for automated manufacturing. The solution is tailored to robotic cells for cooperative assembly of aircraft fuselage panels, where only structural elements are present and robots and humans can share the same workspace, but no physical contact is allowed unless it happens at zero robot speed. The proposed approach addresses the problem of satisfying the minimal set of requirements of an industrial human–robot collaboration (HRC) task: precision and reliability of human detection and tracking in the shared workspace, and correct robot task execution with minimum cycle time while assuring safety for human operators. These requirements often conflict with each other. The former concerns not only safety but also the need to avoid unnecessary robot stops or slowdowns caused by false-positive human detections. The latter, according to the current regulations, concerns the need to compute the minimum protective separation distance between the human operator and the robots and to adjust their speed when dangerous situations occur. This article proposes a novel fuzzy inference approach that controls robot speed to enforce safety while maximizing robot productivity by minimizing cycle time. The approach is supported by a sensor fusion algorithm that merges the images acquired from different depth sensors with those obtained from a thermal camera using a machine learning approach. The methodology is validated in two experiments: the first at lab scale and the second on a full-scale robotic workcell for cooperative assembly of aeronautical structural parts.

Note to Practitioners—This article discusses a way to handle human safety specifications versus production requirements in collaborative robotized assembly systems. State-of-the-art (SoA) approaches cover only a few aspects of both human detection and robot speed scaling. The present research work proposes a complete pipeline that starts from a robust human tracking algorithm and scales the robot speed in real time. An innovative multimodal perception system composed of two depth cameras and a thermal camera monitors the collaborative workspace. The speed scaling algorithm is optimized to account for different human behaviors in less risky and more dangerous situations, so as to guarantee both operator safety and minimum production time, with the aim of better profitability and efficiency for collaborative workstations. The algorithm estimates the operator's intention for real-time computation of the minimum protective separation distance according to the current safety regulations. The robot speed is changed smoothly, which is psychologically advantageous for operators, both in the case of a single worker and of multiple workers. The result is a complete system that is easily implementable on a standard industrial workcell.
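As a concrete illustration of the speed-and-separation-monitoring idea summarized above, the sketch below computes an ISO/TS 15066-style minimum protective separation distance and derives a speed scaling factor from the measured human–robot distance. All function names, parameter values, and the simple linear scaling law are illustrative assumptions; the article's actual controller is a fuzzy inference system fed by the fused depth and thermal perception pipeline.

```python
# Minimal sketch of speed-and-separation monitoring, assuming an
# ISO/TS 15066-style protective separation distance. Parameter values
# and the linear scaling law are illustrative assumptions, not the
# fuzzy inference system proposed in the article.

def protective_separation_distance(v_h, v_r, t_r, t_s, c, z_d, z_r):
    """Minimum protective separation distance S_p at the current instant.

    v_h : operator speed toward the robot [m/s]
    v_r : current robot speed toward the operator [m/s]
    t_r : robot reaction time [s]
    t_s : robot stopping time [s]
    c   : intrusion distance (body-part reach) [m]
    z_d : operator position measurement uncertainty [m]
    z_r : robot position measurement uncertainty [m]
    """
    s_h = v_h * (t_r + t_s)   # operator motion during reaction + stopping
    s_r = v_r * t_r           # robot motion during its reaction time
    s_s = 0.5 * v_r * t_s     # robot stopping distance (assumed constant deceleration)
    return s_h + s_r + s_s + c + z_d + z_r


def speed_scale(distance, s_p, full_speed_margin=1.0):
    """Return a scaling factor in [0, 1] for the nominal robot speed.

    The robot stops when the measured distance drops below s_p and runs
    at full speed beyond s_p + full_speed_margin; in between, the speed
    is reduced smoothly (here linearly, as a stand-in for fuzzy rules).
    """
    if distance <= s_p:
        return 0.0
    if distance >= s_p + full_speed_margin:
        return 1.0
    return (distance - s_p) / full_speed_margin


if __name__ == "__main__":
    # Example with assumed values: operator approaching at 1.6 m/s,
    # robot currently moving at 1.0 m/s toward the operator.
    s_p = protective_separation_distance(
        v_h=1.6, v_r=1.0, t_r=0.1, t_s=0.3, c=0.2, z_d=0.1, z_r=0.05)
    measured_distance = 1.8  # from the fused depth/thermal tracker [m]
    print(f"S_p = {s_p:.2f} m, scale = {speed_scale(measured_distance, s_p):.2f}")
```

In the article, the linear interpolation above is replaced by fuzzy rules that also account for the estimated operator intention, which is what allows the cell to avoid unnecessary slowdowns in low-risk situations while still stopping the robot before contact can occur.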
