Abstract

Increasing the independence of severely disabled individuals is crucial. Assistive robotics can aid in desired activities of daily living, including tasks performed at a distance, e.g., grasping remote objects, turning switches on and off, and opening and closing doors. Robot control is compromised by the lack of efficient interfaces for individuals with disabilities and by the lack of depth perception. This paper addresses these challenges by presenting the development and evaluation of efficient tongue-based robot interfaces and low-level robot control schemes for tele-robotic control through a 2D display. Ten able-bodied participants successfully completed ten rounds of controlling a JACO robot in a water-pouring task, using five different control methods under 2D or 3D visual feedback. The tool-frame-based tongue interface layout, L2_TF (with an emulated joystick, a mode-switch button, and a "GO" button), improved 2D visually guided control of the JACO robot compared with the other tongue control methods. The mean trajectory length for completing the task with L2_TF was only 3% longer than with the standard joystick under 2D feedback. The trajectory length for reaching and grasping a bottle was shortest for L2_TF among all control methods, including the joystick. The iTongue control layouts performed well in gripping time, showing no significant difference between 2D and 3D. The transition from 2D to 3D resulted in a mean decrease of 27.7% in task completion time across all interfaces. L2_TF and the joystick were the most robust, and similarly so, to the transition between 3D and 2D.
