Abstract

As an emerging communication modality, brainwaves can be used to control robots for seamless assembly, especially in noisy environments where voice recognition is unreliable or when an operator's hands are occupied with other tasks and unavailable for gestures. This paper investigates human-robot collaborative assembly that is based on function blocks and driven by brainwaves. Using the wavelet transform, brainwaves measured by EEG sensors are converted into time-frequency images, which a convolutional neural network (CNN) then classifies into commands that trigger a network of function blocks to execute assembly actions. The effectiveness of the system is experimentally validated through an engine-assembly case study.
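To make the brainwave-to-command pipeline concrete, the sketch below shows one possible realization of the step described above: a continuous wavelet transform turns a windowed EEG signal into a time-frequency (scalogram) image, which a small CNN maps to a discrete assembly command. This is a minimal illustration only; the library choices (PyWavelets, PyTorch), the Morlet wavelet, the window length, and all layer sizes are assumptions, not the authors' implementation.

```python
# Illustrative sketch (not the paper's code): EEG window -> scalogram -> command.
import numpy as np
import pywt
import torch
import torch.nn as nn

def eeg_to_scalogram(window, fs=256, scales=np.arange(1, 65)):
    """Return a 2-D time-frequency image (scales x time) for one EEG window."""
    coeffs, _ = pywt.cwt(window, scales, "morl", sampling_period=1.0 / fs)
    return np.abs(coeffs).astype(np.float32)

class CommandCNN(nn.Module):
    """Small CNN mapping a scalogram to one of n_commands assembly commands."""
    def __init__(self, n_commands=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.classifier = nn.Linear(32 * 4 * 4, n_commands)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(torch.flatten(x, 1))

# Example: one 1-second EEG window sampled at 256 Hz -> predicted command index,
# which would then trigger the corresponding function block for an assembly action.
window = np.random.randn(256)                       # placeholder EEG samples
image = torch.from_numpy(eeg_to_scalogram(window))  # shape (64, 256)
logits = CommandCNN()(image.unsqueeze(0).unsqueeze(0))
command = int(logits.argmax(dim=1))
```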
