Abstract
Industrial robots are reliable machines for manufacturing tasks such as assembly, welding, painting, and palletizing. They have traditionally been programmed by an operator using a teach pendant in a point-to-point scheme, with limited sensing capabilities such as industrial vision systems and force/torque sensors. Today, industrial robots can react to environmental changes within their task domain, but they are still unable to learn skills that would let them use their current knowledge effectively. Such skill learning is desirable in unstructured environments, where knowledge can be acquired and enhanced so that robots can interact effectively in multimodal real-world scenarios. In this paper, an alternative approach based on Artificial Neural Networks for embedding and effectively enhancing knowledge in industrial robots working in manufacturing scenarios is reviewed. During learning, the robot uses its sensorial capabilities, resembling a human operator, to successfully accomplish the requested operations in assembly and welding. Current work, issues, and experiments are presented, and future work is envisaged regarding learning in distributed systems in smart factories involving human-robot interaction.