Abstract

Efficient cooperation between humans and industrial robots is based on a common understanding of the task as well as the perception and anticipation of the partner's next action. In this article, a hybrid assembly station is presented in which an industrial robot can learn new tasks from worker instructions. The learned task is then performed jointly by the robot and the human worker in a shared workspace. This workspace is monitored by multi-sensory perception that detects both persons and objects. The environmental data are processed by the collision avoidance module to ensure the safety of persons and equipment. The real-time capable software architecture and the orchestration of the involved modules by a knowledge-based system controller are presented. Finally, the functionality is demonstrated in an experimental cell within a real-world production scenario.
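The abstract describes environmental data from multi-sensory perception being processed by a collision avoidance module to keep persons and equipment safe. As a rough illustration only (the paper's actual method is not given here), one common pattern is speed-and-separation monitoring, where the robot's speed is scaled by the distance to the nearest detected person or object. All names, thresholds, and the linear scaling below are hypothetical, not taken from the article:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    """A detected person or object with its distance to the robot (metres).
    Hypothetical structure; the article's perception output is not specified here."""
    label: str
    distance_m: float

def safe_speed(detections: List[Detection],
               stop_dist: float = 0.5,   # assumed stop threshold (m)
               slow_dist: float = 1.5,   # assumed full-speed threshold (m)
               full_speed: float = 1.0) -> float:
    """Scale the commanded robot speed by the nearest detection:
    stop inside stop_dist, ramp linearly up to full speed at slow_dist."""
    if not detections:
        return full_speed
    nearest = min(d.distance_m for d in detections)
    if nearest <= stop_dist:
        return 0.0
    if nearest >= slow_dist:
        return full_speed
    # Linear interpolation between the two thresholds.
    return full_speed * (nearest - stop_dist) / (slow_dist - stop_dist)
```

In a real-time architecture such as the one described, a check of this kind would run in every control cycle, with the knowledge-based system controller deciding how the task proceeds when the robot must slow down or stop.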
