Abstract

Although existing industrial robots can work in challenging environments, accomplish high-precision assignments, and help to increase productivity, most of them are still operated with prebuilt commands and robot programs. Given the pressure of labor costs and fierce competition, there is a tremendous shortage of autonomous, intelligent robots and cyber-physical systems with perception and decision-making abilities for upcoming Industry 4.0 applications. Such intelligent robots are able to analyze their tasks by selecting appropriate tools, planning their movements and executing suitable operations, much as a trained human worker would. In this paper, an industrial UR5 robot is enabled to perceive, locate and interact with different objects such as tools and office supplies. Using a stereo vision camera, both RGB and depth data of the robot's surroundings and workspace are obtained. This data is fed into a deep-learning Faster R-CNN network to recognize and localize objects from 50 different classes. With the derived information, appropriate operations can be planned and executed. The experimental results demonstrate successful recognition and gripping of reachable objects in the robot's workspace, and suitable feedback for unreachable objects. This work confirms the feasibility of combining current deep-learning algorithms with industrial robots to build intelligent systems. With further developed hardware and software algorithms, such configurations will become increasingly useful for automation and assembly technologies in upcoming Industry 4.0 applications, representing a significant milestone in the construction of intelligent robots and laying the foundation for future work.
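The localization and reachability step described above can be illustrated with the standard pinhole camera model: a detection's pixel coordinates plus the depth value are back-projected to a 3D point, which is then compared against the arm's reach. The following is a minimal sketch, not the authors' code; the intrinsics (fx, fy, cx, cy) are illustrative values, the camera frame is assumed to coincide with the robot base frame, and 850 mm is the UR5's nominal reach.

```python
import math

def pixel_to_camera_xyz(u, v, depth_m, fx, fy, cx, cy):
    """Back-project pixel (u, v) with depth (metres) into 3D camera coordinates
    using the pinhole model: x = (u - cx) * z / fx, y = (v - cy) * z / fy."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

def is_reachable(point_xyz, reach_m=0.85):
    """UR5 nominal reach is 850 mm from the base.
    Assumes (illustratively) that the camera frame equals the base frame."""
    return math.dist((0.0, 0.0, 0.0), point_xyz) <= reach_m

# Example: a detection box centred at pixel (320, 240) with 0.6 m depth
target = pixel_to_camera_xyz(320, 240, 0.6, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
print(is_reachable(target))  # a point 0.6 m away is within the 0.85 m reach
```

In the paper's setup this decision drives the behavior reported in the results: reachable objects are gripped, while unreachable ones trigger feedback instead of a motion command.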
