Abstract
This article proposes a novel decentralized, two-layered, multi-sensor fusion architecture for resilient pose estimation. In the first layer, a set of distributed nodes integrates all possible combinations of pose information from the different sensors, producing multiple pose estimates through a bank of extended Kalman filters. Based on the estimated poses obtained from the first layer, a Fault Resilient Optimal Information Fusion (FR-OIF) paradigm is introduced in the second layer to provide a trusted pose estimate. The second layer combines the output of each first-layer node in a weighted linear combination, while explicitly accounting for the maximum likelihood fusion criterion. Moreover, in the case of inaccurate measurements, the proposed FR-OIF formulation achieves self-resiliency through a built-in fault isolation mechanism, and is thus able to maintain accurate localization in the presence of sensor failures or erroneous measurements. To demonstrate the effectiveness of the proposed fusion architecture, extensive experimental studies were conducted with a micro aerial vehicle equipped with various onboard pose sensors: a 3D lidar, an Intel RealSense camera, an ultra-wideband node, and an IMU. The efficiency of the proposed framework is evaluated through multiple experimental results, and its superiority is demonstrated through a comparison with a classical centralized multi-sensor fusion approach.
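As a rough illustration of the second-layer fusion step, the following minimal numpy sketch combines the pose estimates of the first-layer nodes in information (inverse-covariance) form, which realizes the maximum likelihood fusion criterion, and adds a simple chi-square consistency gate as a stand-in for the built-in fault isolation. The function names, the gating test, and the threshold are illustrative assumptions, not the paper's exact FR-OIF formulation.

    import numpy as np

    def ml_fuse(xs, Ps):
        # Maximum-likelihood (information-form) fusion: each node's pose
        # estimate x_i is weighted by its inverse covariance P_i^{-1}.
        infos = [np.linalg.inv(P) for P in Ps]
        P_fused = np.linalg.inv(sum(infos))
        x_fused = P_fused @ sum(I @ x for I, x in zip(infos, xs))
        return x_fused, P_fused

    def fr_oif(xs, Ps, gate=16.27):
        # Illustrative fault isolation: fuse all nodes once, flag nodes
        # whose Mahalanobis distance to the fused estimate exceeds a
        # chi-square gate (16.27 ~ 99.9% quantile for 3 DoF), and
        # re-fuse using only the surviving nodes.
        x0, _ = ml_fuse(xs, Ps)
        keep = [i for i, (x, P) in enumerate(zip(xs, Ps))
                if (x - x0) @ np.linalg.inv(P) @ (x - x0) <= gate]
        if not keep:  # degenerate case: no node passes, keep all
            keep = list(range(len(xs)))
        return ml_fuse([xs[i] for i in keep], [Ps[i] for i in keep])

For example, fusing three 3D position estimates from hypothetical lidar, visual, and UWB nodes would read fr_oif([x_lio, x_vio, x_uwb], [P_lio, P_vio, P_uwb]).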
Highlights
State estimation is a challenging problem in the field of robotics that has been explored extensively in recent years across different scientific and technological communities, such as robotics [1], aerospace [2], automatic control [3], artificial intelligence [4], and computer vision [5].
The onboard sensors provide the 3D pose of the Micro Aerial Vehicle (MAV): (a) the IMU, (b) the 3D lidar integrated with Lidar Odometry (LIO), providing a position and orientation denoted by p_LIO, q_LIO, (c) the Intel RealSense T265 visual sensor integrated with Visual Odometry (VIO), providing a position and orientation denoted by p_VIO, q_VIO, and (d) the Ultra-Wideband (UWB) transceivers, providing the position of the MAV, denoted by p_UWB.
A novel decentralized multi-sensor fusion framework for resilient pose estimation of MAVs is presented.
Summary
State estimation is a challenging problem in the field of robotics that has been explored extensively in recent years across different scientific and technological communities, such as robotics [1], aerospace [2], automatic control [3], artificial intelligence [4], and computer vision [5]. The second contribution stems from the evaluation of the effectiveness of the second-layer fusion architecture when a sensor stops working correctly for a certain period or when the system receives inaccurate measurements from the sensors. In this case, an optimal information filter with fault handling capability is proposed and incorporated to obtain more robust and accurate responses from the corrupted outcomes: a modified covariance weighting scheme is introduced for combining all possible nodes in the second layer, which mitigates the effect of erroneous measurements. The comparison of these different fusion architectures has been carried out using experimentally collected data sets.
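One plausible way to realize such a modified covariance weighting is sketched below, under the assumption that each node reports a scalar innovation residual r_i: nodes with large residuals have their covariance inflated so they are smoothly de-weighted in the information-form combination rather than dropped outright. The inflation law and the sigma parameter are hypothetical choices for illustration, not the paper's formulation.

    import numpy as np

    def weighted_info_fuse(xs, Ps, residuals, sigma=1.0):
        # Modified covariance weighting (illustrative): inflate each
        # node's covariance by a factor that grows with its normalized
        # innovation residual, so erroneous measurements contribute
        # less information to the fused estimate.
        fused_info, fused_vec = 0.0, 0.0
        for x, P, r in zip(xs, Ps, residuals):
            inflation = 1.0 + (r / sigma) ** 2  # larger residual -> weaker weight
            I = np.linalg.inv(inflation * P)
            fused_info = fused_info + I
            fused_vec = fused_vec + I @ x
        P_fused = np.linalg.inv(fused_info)
        return P_fused @ fused_vec, P_fused

Compared with the hard chi-square gate sketched earlier, this soft de-weighting keeps every node in the combination, which avoids discontinuities in the fused pose when a sensor degrades gradually.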