For mobile robots to operate autonomously and safely, they must be able to perceive their environment adequately despite challenging or unpredictable conditions in their sensory apparatus. This problem is usually addressed through ad hoc Fault Detection and Diagnosis (FDD) approaches that do not generalize easily. In this work, we leverage Bayesian Networks (BNs) to propose a novel probabilistic inference architecture that provides generality, rigorous inference and real-time performance for the detection, diagnosis and recovery from diverse and multiple sensory failures in robotic systems. Our proposal achieves these goals by structuring a BN in a multidimensional setting that, to the best of our knowledge, deals coherently and rigorously for the first time with the following issues: the modeling of complex interactions among the components of the system, including sensors, anomaly detection and recovery; the representation of sensory information and other kinds of knowledge at different levels of cognitive abstraction; and the management of the temporal evolution of sensory behavior. Real-time performance is achieved by compiling these BNs into feedforward neural networks. Our proposal has been implemented and tested for mobile robot navigation in environments with human presence, a complex task that involves diverse sensor anomalies. The results obtained from both simulated and real experiments show that our architecture enhances the safety and robustness of robotic operation: among other metrics, the minimum distance to pedestrians, the tracking time and the navigation time all improve statistically in the presence of anomalies, with changes in their medians ranging from ≃20% to ≃500%.
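As a purely illustrative aid, and not the authors' architecture, the sketch below shows how a discrete BN can be used for sensor fault diagnosis of the kind summarized above: a latent fault variable explains observed symptoms, and exact inference yields the fault posterior. All node names and probabilities here are hypothetical toy values; the full architecture described in the abstract additionally handles multiple abstraction levels, temporal evolution, recovery, and compilation into feedforward neural networks for real-time use.

```python
# Illustrative sketch only: a tiny discrete Bayesian network for sensor fault
# diagnosis with hypothetical node names and made-up probabilities (not the
# paper's model). A latent fault variable explains two observed symptoms that
# are assumed conditionally independent given the fault state; exact inference
# by enumeration gives the fault posterior used for diagnosis.

# Prior over the latent fault state (True = sensor is faulty).
P_fault = {True: 0.02, False: 0.98}

# Conditional probability tables P(symptom = True | fault).
P_symptom_given_fault = {
    "range_dropout": {True: 0.90, False: 0.05},
    "tracking_lost": {True: 0.70, False: 0.10},
}

def posterior_fault(evidence):
    """Return P(fault | evidence) by enumerating the two fault states.

    `evidence` maps symptom names to observed booleans.
    """
    joint = {}
    for fault in (True, False):
        p = P_fault[fault]
        for name, observed in evidence.items():
            p_true = P_symptom_given_fault[name][fault]
            p *= p_true if observed else (1.0 - p_true)
        joint[fault] = p
    z = sum(joint.values())  # normalization constant P(evidence)
    return {fault: p / z for fault, p in joint.items()}

if __name__ == "__main__":
    post = posterior_fault({"range_dropout": True, "tracking_lost": True})
    print(f"P(sensor fault | both symptoms) = {post[True]:.3f}")  # ~0.72
```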