Abstract

Upper body kinematics is essential for motor function assessment and robot-assisted rehabilitation training. Wearable sensor systems, such as inertial measurement units (IMUs), provide affordable alternatives to laboratory-based motion capture systems for use in daily life. However, sensor-to-segment calibration often relies on predefined postures or movements, which are hard to perform accurately, particularly for patients with a limited range of motion. A visual–inertial sensor system is presented, consisting of three sensor modules attached to the trunk, upper arm, and forearm. Each module comprises an IMU and an ArUco marker that can be captured by a camera, and a drift-free orientation of each module is computed by visual–inertial fusion. The sensor-to-segment transformations are calibrated from a period of arbitrary arm movements in either a 2-D plane or 3-D space, simulating the training process assisted by end-effector robots. Experiments were conducted to validate the feasibility and evaluate the accuracy of the proposed method. The estimated shoulder and elbow joint angles correlated well ($>0.986$) with the ground truth from the optical motion capture (OMC) system. The joint angles presented low root-mean-square errors (RMSEs) ($<4^{\circ}$), except for the forearm pronation–supination angle (9.34°), which relied on manual alignment. The sensor system provides a simple and easy-to-use solution for movement assessment during robot-assisted training.
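
To make the pipeline concrete, the sketch below (not the authors' implementation) shows how a joint rotation could be derived once fused module orientations and sensor-to-segment calibrations are available, using SciPy's Rotation class. The function name, the Euler-angle decomposition convention, and all numeric values are illustrative assumptions.

import numpy as np
from scipy.spatial.transform import Rotation as R

def elbow_angles(R_world_upper, R_world_fore, R_cal_upper, R_cal_fore):
    """Elbow joint angles from two segment-module orientations.

    R_world_* : module orientation in the world frame (from visual-inertial fusion)
    R_cal_*   : sensor-to-segment calibration rotations
    """
    # Segment orientations in the world frame
    R_upper = R_world_upper * R_cal_upper
    R_fore = R_world_fore * R_cal_fore
    # Relative rotation of the forearm with respect to the upper arm
    R_elbow = R_upper.inv() * R_fore
    # Decompose into anatomical angles; the "zxy" axis order is an assumption
    flexion, carrying, pronation = R_elbow.as_euler("zxy", degrees=True)
    return flexion, pronation

# Example: identity calibrations and a 90-degree elbow flexion
up = R.identity()
fo = R.from_euler("z", 90, degrees=True)
print(elbow_angles(up, fo, R.identity(), R.identity()))  # -> (90.0, 0.0)

In practice, the calibration rotations R_cal_* would come from the paper's arbitrary-movement calibration procedure rather than being identities as in this toy example.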
