Abstract

Information from complementary and redundant sensors is often combined within sensor fusion algorithms to obtain a single accurate observation of the system at hand. However, measurements from each sensor are characterized by uncertainties. When multiple data are fused, it is often unclear how all these uncertainties interact and influence the overall performance of the sensor fusion algorithm. To address this issue, a benchmarking procedure is presented, where simulated and real data are combined in different scenarios in order to quantify how each sensor’s uncertainties influence the accuracy of the final result. The proposed procedure was applied to the estimation of the pelvis orientation using a waist-worn magnetic-inertial measurement unit. Ground-truth data were obtained from a stereophotogrammetric system and used to generate simulated data. Two Kalman-based sensor fusion algorithms were submitted to the proposed benchmarking procedure. For the considered application, gyroscope uncertainties proved to be the main error source in orientation estimation accuracy for both tested algorithms. Moreover, although different performances were obtained using simulated data, these differences became negligible when real data were considered. The outcome of this evaluation may be useful both to improve the design of new sensor fusion methods and to drive the algorithm tuning process.

Highlights

  • Sensor fusion is a signal processing technique that combines data measured by multiple sources in order to create a single measurement system with an augmented performance over each standalone sensor [1,2]

  • The main contribution of this paper is to propose a novel benchmarking method for the assessment of sensor fusion algorithms (SFAs)’ performance

  • By means of the proposed methodology the following considerations for the MIMU-based human motion tracking can be drawn: (1) the gyroscope errors appear to be the main error source for both the SFAs considered; (2) the processing of accelerometer data proposed in Algorithm 1 is promising because it reduces the detrimental effect of the external acceleration; (3) using the magnetometer data for the heading estimation only leads to more accurate attitude estimates
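The third consideration above, restricting the magnetometer to heading (yaw) estimation only, can be illustrated with a standard tilt-compensation step. The sketch below is not the paper's algorithm; it is a generic, illustrative function showing how the magnetic field, once rotated into the horizontal plane using the current roll/pitch estimate, contributes only to the heading angle, so magnetic disturbances cannot corrupt the inclination estimates.

```python
import numpy as np

def heading_from_magnetometer(mag, roll, pitch):
    """Tilt-compensated heading (yaw) from a 3-axis magnetometer.

    mag   : (3,) magnetic field in the sensor frame (any unit).
    roll  : current roll estimate [rad].
    pitch : current pitch estimate [rad].

    The field is first projected onto the horizontal plane using the
    roll/pitch estimates; heading is then taken from the horizontal
    components only, leaving the attitude (roll/pitch) untouched.
    """
    mx, my, mz = mag
    # Horizontal components of the magnetic field vector.
    xh = (mx * np.cos(pitch)
          + my * np.sin(roll) * np.sin(pitch)
          + mz * np.cos(roll) * np.sin(pitch))
    yh = my * np.cos(roll) - mz * np.sin(roll)
    # Quadrant-aware heading angle.
    return np.arctan2(-yh, xh)
```

Because roll and pitch enter only as inputs here, a magnetic disturbance changes the returned yaw but never the inclination, which is the rationale behind highlight (3).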


Summary

Introduction

Sensor fusion is a signal processing technique that combines data measured by multiple sources in order to create a single measurement system with an augmented performance over each standalone sensor [1,2]. Conversely, when used alone, these sensors may yield poor results due to the different issues characterizing the magnetic and the inertial sensors. In this respect, MIMU observations are disparate [3], in the sense that part of the orientation information is observed in three different physical domains, i.e., the angular velocity, the specific force, and the Earth magnetic field vector. When sensor observations are fused in a sensor fusion algorithm (SFA), it is very difficult to assess to what extent each sensor issue influences the final error. This information would be crucial to guide the SFA design process (i.e., the choice of different tuning settings or of the adaptive mechanisms to be built into the SFA) or to compare different combinations of sensor hardware components. The proposed benchmarking procedure was assessed in estimating the 3D orientation of the pelvis during a Timed Up and Go test.
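To make the fusion idea concrete, the sketch below shows a minimal complementary filter for tilt estimation. This is a deliberately simplified stand-in, not either of the Kalman-based SFAs evaluated in the paper: it blends gyroscope integration (accurate over short intervals but drifting) with the accelerometer's gravity-based tilt (drift-free but corrupted by external acceleration), which is exactly the trade-off that the benchmarking procedure quantifies sensor by sensor. All names and the blending constant `alpha` are illustrative assumptions.

```python
import numpy as np

def complementary_tilt(gyro, accel, dt, alpha=0.98):
    """Estimate roll and pitch by fusing gyroscope and accelerometer data.

    gyro  : (N, 3) angular velocity in the sensor frame [rad/s]
    accel : (N, 3) specific force in the sensor frame [m/s^2]
    dt    : sampling interval [s]
    alpha : blending constant (high-pass gyro, low-pass accelerometer)

    Returns an (N, 2) array of [roll, pitch] estimates [rad].
    """
    roll, pitch = 0.0, 0.0
    out = np.zeros((len(gyro), 2))
    for k, (w, a) in enumerate(zip(gyro, accel)):
        # Short-term propagation: integrate the angular velocity.
        roll_g = roll + w[0] * dt
        pitch_g = pitch + w[1] * dt
        # Long-term reference: tilt from the measured gravity direction.
        roll_a = np.arctan2(a[1], a[2])
        pitch_a = np.arctan2(-a[0], np.hypot(a[1], a[2]))
        # Complementary blend of the two observations.
        roll = alpha * roll_g + (1 - alpha) * roll_a
        pitch = alpha * pitch_g + (1 - alpha) * pitch_a
        out[k] = roll, pitch
    return out
```

A Kalman-based SFA replaces the fixed constant `alpha` with a gain computed from the modeled sensor uncertainties, which is why characterizing each sensor's error sources, as the proposed benchmarking procedure does, directly informs the filter tuning.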

Methods
Sensor Fusion Algorithms
Overview of the Kalman-based
Measure of Performance
Statistical Analysis
Results
Discussions
Conclusions

