Abstract
Interest in human-robot collaboration (HRC) for intelligent manufacturing service systems is steadily increasing. Fluent human-robot coexistence in manufacturing requires accurate estimation of human motion intention so that the efficiency and safety of HRC can be guaranteed. Traditional motion prediction solutions mainly define human motion as the sequential positions of human skeleton joints, which neglects the tools or product components held in the hand. Context-aware temporal processing is key to evaluating a human motion before it is completed, saving time while recognizing the human's intention. In this paper, a deep learning system combining a convolutional neural network (CNN) and a long short-term memory (LSTM) network operating on vision signals is explored to predict human motion accurately. The LSTM automatically extracts temporal patterns of human motion and outputs the prediction result before the motion is completed. Owing to its end-to-end characteristic, the approach not only avoids complex hand-crafted feature extraction but also provides natural interaction between human and robot without wearable devices or tags that may burden the human. A case study of desktop computer disassembly is carried out to demonstrate the feasibility of the proposed method. Experimental results show that our method outperforms three other algorithms in prediction accuracy.
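The abstract describes a CNN-LSTM pipeline over vision signals, but gives no implementation details. Below is a minimal sketch of such an architecture, not the authors' implementation: a small CNN encodes each video frame and an LSTM aggregates the per-frame features to classify the upcoming motion. All layer sizes, the class count, and the input resolution are illustrative assumptions.

```python
# Hedged sketch of a generic CNN-LSTM motion-intention classifier (PyTorch).
# Not the paper's model; hyperparameters are placeholders.
import torch
import torch.nn as nn

class CNNLSTMPredictor(nn.Module):
    def __init__(self, num_classes=6, feat_dim=128, hidden_dim=64):
        super().__init__()
        # Per-frame CNN feature extractor (assumed small backbone).
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim), nn.ReLU(),
        )
        # LSTM captures temporal patterns across the observed frame sequence.
        self.lstm = nn.LSTM(feat_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, num_classes)

    def forward(self, frames):
        # frames: (batch, time, 3, H, W) -- a partially observed motion clip.
        b, t = frames.shape[:2]
        feats = self.cnn(frames.flatten(0, 1)).view(b, t, -1)
        _, (h_n, _) = self.lstm(feats)
        return self.head(h_n[-1])  # logits for the predicted motion class

# Example: predict from 8 observed frames at 64x64 resolution.
logits = CNNLSTMPredictor()(torch.randn(2, 8, 3, 64, 64))
print(logits.shape)  # torch.Size([2, 6])
```

Because the network maps raw frames end to end to a motion class, no hand-crafted features or wearable markers are required, which matches the interaction style the abstract emphasizes.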