Abstract

Clinical mobility assessment is traditionally performed in laboratories using complex and expensive equipment. Limited access to such equipment, combined with the emerging trend to assess mobility in a free-living environment, creates a need for body-worn sensors (e.g., inertial measurement units—IMUs) that are capable of capturing the complexity of motor performance through meaningful measures, such as joint orientation. However, the accuracy of joint orientation estimates obtained from IMUs may be affected by the environment, the joint tracked, the type of motion performed, and the movement velocity. This study investigates a quality control (QC) process to assess the quality of orientation data based on features extracted from the raw inertial sensors’ signals. Joint orientation (trunk, hip, knee, ankle) of twenty participants was acquired by an optical motion capture system and IMUs during a variety of tasks (sit, sit-to-stand transition, walking, turning) performed under varying conditions (speed, environment). An artificial neural network was used to classify good and bad sequences of joint orientation with a sensitivity and a specificity above 83%. This study confirms that QC of IMU joint orientation data based on raw signal features is feasible. This innovative QC approach may be of particular interest in a big data context, such as remote monitoring of patients’ mobility.

Highlights

  • Advances in wearable sensor technology offer unique opportunities for clinicians and researchers to develop field-based approaches to remotely capture outcome measures traditionally studied under laboratory conditions

  • The current study investigates the use of a quality control (QC) algorithm, here an artificial neural network, to automatically provide feedback on the quality of inertial measurement unit (IMU) joint orientation data acquired during a multitude of tasks, without further knowledge of the task performed or the joint tracked

  • This paper aims at (1) developing a simple set of features based on IMU raw signals to characterize data segments; (2) verifying the ability of this set of features to discriminate between good and bad joint orientation estimates when fed into an artificial neural network (ANN); and (3) evaluating the impact of such autonomous QC and clean-up processes on joint orientation estimate accuracy in a variety of tasks
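The paper does not list its exact feature set in this excerpt, so the following is only a minimal sketch of the general idea: summarizing a segment of raw accelerometer and gyroscope samples into a small feature vector that a classifier (e.g., an ANN) could use to label the segment's orientation estimate as good or bad. The feature choices (magnitude mean, magnitude variability, a jerk-like difference measure) and the function name are illustrative assumptions, not the authors' method.

```python
import numpy as np

def segment_features(acc, gyr):
    """Summarize one IMU data segment as a small feature vector.

    acc : (N, 3) accelerometer samples (m/s^2)
    gyr : (N, 3) gyroscope samples (rad/s)

    Illustrative features only; the actual feature set used in the
    study is not specified in this excerpt.
    """
    feats = []
    for sig in (acc, gyr):
        mag = np.linalg.norm(sig, axis=1)          # per-sample magnitude
        feats.append(mag.mean())                   # overall intensity
        feats.append(mag.std())                    # variability
        feats.append(np.abs(np.diff(mag)).mean())  # jerk-like roughness
    return np.array(feats)

# Toy quasi-static segment: accelerometer near gravity, low gyro noise.
rng = np.random.default_rng(0)
acc = np.array([0.0, 0.0, 9.81]) + 0.05 * rng.standard_normal((200, 3))
gyr = 0.01 * rng.standard_normal((200, 3))
x = segment_features(acc, gyr)
print(x.shape)  # one fixed-length vector per segment
```

Fixed-length vectors like this could then be fed to any classifier; segments with anomalous intensity or roughness values would be candidates for the "bad orientation" class.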

Introduction

Advances in wearable sensor technology offer unique opportunities for clinicians and researchers to develop field-based approaches to remotely capture outcome measures traditionally studied under laboratory conditions. Remote patient monitoring with wearable sensors can be used to enhance and personalize a patient’s medical follow-up. Remote monitoring of a patient’s mobility and physical functioning may enable early detection of symptoms related to specific neurological disorders, permitting rapid and personalized intervention. Amongst those wearable sensors, inertial measurement units (IMUs) stand out as a promising option for remote patient monitoring. An IMU is a platform that typically incorporates accelerometers, gyroscopes and magnetometers to measure linear acceleration, angular velocity and magnetic field, respectively. Combined with a fusion algorithm, the IMU becomes an attitude and heading reference system (AHRS) that estimates the orientation of the platform in a global reference frame based on gravity and magnetic north.
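To make the fusion idea concrete, here is a minimal single-axis complementary filter: the gyroscope is integrated for short-term orientation while the accelerometer's gravity direction corrects long-term drift. This is only an illustrative sketch of the sensor-fusion principle behind an AHRS, not the specific algorithm used in the study, and it omits the magnetometer (heading) correction; the function name and the blend factor `alpha` are assumptions.

```python
import numpy as np

def complementary_tilt(acc, gyr_x, fs=100.0, alpha=0.98):
    """Estimate roll (rad) by fusing gyro integration with the
    accelerometer's gravity-based roll estimate.

    acc   : (N, 3) accelerometer samples (m/s^2)
    gyr_x : (N,) angular velocity about the sensor x-axis (rad/s)
    alpha : weight on the gyro path (1 - alpha on the accel path)
    """
    dt = 1.0 / fs
    roll = np.zeros(len(gyr_x))
    for k in range(1, len(gyr_x)):
        gyro_roll = roll[k - 1] + gyr_x[k] * dt      # short-term: integrate gyro
        acc_roll = np.arctan2(acc[k, 1], acc[k, 2])  # long-term: gravity direction
        roll[k] = alpha * gyro_roll + (1 - alpha) * acc_roll
    return roll

# Static sensor tilted by 0.2 rad: the estimate converges toward 0.2
# even though the gyro reads zero, because gravity pulls it there.
true_roll = 0.2
acc = np.tile([0.0, 9.81 * np.sin(true_roll), 9.81 * np.cos(true_roll)], (300, 1))
gyr_x = np.zeros(300)
roll = complementary_tilt(acc, gyr_x)
print(round(roll[-1], 3))
```

Production AHRS algorithms (e.g., Kalman- or gradient-descent-based filters) follow the same correct-the-drift principle but operate on full 3-D orientation and add magnetometer-based heading correction.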

