Abstract

This paper presents the design and performance of a body-machine interface (BoMI) system in which a user controls a robotic wheelchair in a 3D virtual environment using signals derived from his or her shoulder and elbow movements. BoMI promotes the perspective that users should no longer be mere operators of an engineered system but an embedded part of its functional design. The system provides real-time control of robotic devices based on user-specific dynamic body-response signatures captured by a high-density 52-channel sensor shirt. It not only gives access to the user's body signals but also translates them from the user's body space to the virtual-reality device-control space. We have evaluated the efficiency of this BoMI system in a semi-cylindrical 3D virtual-reality setup. Experimental studies demonstrate how this transformation of multi-degree-of-freedom human body signals controls a robotic wheelchair navigation task in a 3D virtual-reality environment. We also show how machine learning can adapt the interface to the degrees of freedom of the human body by correcting the errors made by the user.
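The abstract describes translating high-dimensional body signals (52 sensor channels) into a low-dimensional device-control space. A common approach in BoMI work is a linear dimensionality reduction such as PCA fit on a calibration recording; the sketch below illustrates that idea. Note this is a minimal illustration only: the function names, the 2-D control interpretation (e.g. forward speed and turning rate), and the use of PCA itself are assumptions, since the abstract does not specify the paper's actual mapping or calibration procedure.

```python
import numpy as np

def fit_pca_map(calibration, n_components=2):
    """Fit a linear map from calibration data of shape (samples, 52).

    Hypothetical sketch: the principal directions of the centered
    calibration data define the projection into control space.
    """
    mean = calibration.mean(axis=0)
    # SVD of the centered data; rows of vt are the principal directions.
    _, _, vt = np.linalg.svd(calibration - mean, full_matrices=False)
    components = vt[:n_components]          # (2, 52) projection matrix
    return mean, components

def body_to_control(sample, mean, components):
    """Project one 52-channel frame into the 2-D device-control space."""
    return components @ (sample - mean)

# Stand-in calibration recording (random data in place of real sensor frames).
rng = np.random.default_rng(0)
calib = rng.normal(size=(200, 52))
mean, comps = fit_pca_map(calib)
control = body_to_control(calib[0], mean, comps)
print(control.shape)   # -> (2,)
```

In a real BoMI pipeline the two control outputs would then drive the wheelchair's translation and rotation, and an adaptive (error-correcting) layer, as the abstract suggests, could update the projection over time.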
