Abstract

Purpose
The purpose of this paper is to extend the use of stroke gestures to manipulation tasks, making the interaction between human and robot more efficient.

Design/methodology/approach
In this paper, a set of stroke gestures is designed for typical manipulation tasks. A gesture recognition and parameter extraction system is proposed to exploit the information in stroke gestures drawn by users.

Findings
The results show that the designed gesture recognition subsystem reaches a recognition accuracy of 99.00 per cent. The parameter extraction subsystem successfully extracts the parameters needed for typical manipulation tasks with a success rate of about 86.30 per cent. The system shows acceptable performance in the experiments.

Practical implications
Using stroke gestures in manipulation tasks makes the transmission of human intentions to robots more efficient. The proposed gesture recognition subsystem is based on a convolutional neural network, which is robust to varied inputs. The parameter extraction subsystem extracts the spatial information encoded in stroke gestures.

Originality/value
The author designs stroke gestures for manipulation tasks, which extends the usage of stroke gestures. The proposed gesture recognition and parameter extraction system uses stroke gestures to obtain the type of task and the important parameters for that task simultaneously.
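To illustrate the kind of recognition subsystem the abstract describes, the following is a minimal sketch of a convolutional neural network that classifies rasterized stroke gestures. It is not the authors' architecture: the 32x32 grayscale input size, the layer dimensions and the 10-class output are illustrative assumptions, and PyTorch is used only as a convenient framework for the sketch.

```python
# Hedged sketch: a small CNN classifier for rasterized stroke gestures.
# Assumptions (not from the paper): strokes are rendered as 32x32
# grayscale images and there are 10 gesture classes.
import torch
import torch.nn as nn

class StrokeGestureCNN(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),   # 1x32x32 -> 16x32x32
            nn.ReLU(),
            nn.MaxPool2d(2),                               # -> 16x16x16
            nn.Conv2d(16, 32, kernel_size=3, padding=1),   # -> 32x16x16
            nn.ReLU(),
            nn.MaxPool2d(2),                               # -> 32x8x8
        )
        self.classifier = nn.Linear(32 * 8 * 8, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        x = torch.flatten(x, 1)        # flatten feature maps per sample
        return self.classifier(x)      # class scores for each gesture type

# Usage example: classify one rasterized stroke image (random data here).
model = StrokeGestureCNN()
stroke_image = torch.rand(1, 1, 32, 32)
gesture_class = model(stroke_image).argmax(dim=1)
print(gesture_class.item())
```

In the paper's pipeline, the recognized gesture class would select the manipulation task type, while a separate parameter extraction step reads spatial parameters from the stroke itself; that step is not sketched here.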
