Abstract

An exoskeleton haptic interface is developed for functional training in virtual environments. A composite control scheme enables a variety of tasks to be implemented, and the Qt graphics library is used to generate the virtual environment for the haptic interface at the hand, as well as graphical user interfaces for input and telemetry. Inter-process communication converts telemetry from the exoskeleton into motion commands for objects in the virtual environment. A second haptic interface at the upper arm controls the elbow-orbit self-motion of the arm during tasks. Preliminary results are reviewed for a wall-painting task in which the virtual wall stiffness and viscosity are generated using an admittance controller.
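The abstract does not give the controller equations. As an illustrative sketch only, a standard admittance law for rendering a virtual wall with stiffness $K$ and viscosity $B$ maps the measured contact force $F_{\mathrm{ext}}$ to a commanded hand motion $x$:

$$ M_d\,\ddot{x} + B\,\dot{x} + K\,(x - x_w) = F_{\mathrm{ext}}, $$

where $M_d$ is a chosen virtual mass and $x_w$ is the position of the wall surface. The symbols $M_d$ and $x_w$ are assumptions introduced here for illustration; the paper's actual formulation may differ.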
