Abstract

Mouse, keyboard and graphical user interfaces are commonly used in the field of human-robot interaction (HRI) for robot control. Although these traditional user interfaces are accepted as the standard for the majority of computational tasks, their generic nature and interaction styles may not fit robot navigation tasks well. In our proposed research, we intend to explore alternative UIs that take advantage of innate human skills in physical object manipulation and spatial perception to overcome the problems associated with traditional UIs. We suggest the use of tangible user interfaces (TUIs) for HRI applications, especially for one-to-many robot navigation tasks. We hope our proposed idea will provide insight into future HRI interface design.

