Abstract

The adoption of multi-sensor data fusion techniques is essential to effectively merge and analyze the heterogeneous data collected by multiple sensors pervasively deployed in a smart environment. Existing literature leverages contextual information in the fusion process to increase the accuracy of inference, and hence of decision making, in a dynamically changing environment. In this paper, we propose a context-aware, self-optimizing, adaptive system for sensor data fusion, based on a three-tier architecture. Heterogeneous data collected by sensors at the lowest tier are combined by a dynamic Bayesian network at the intermediate tier, which also integrates contextual information to refine the inference process. At the highest tier, a self-optimization process dynamically reconfigures the sensory infrastructure by sampling a subset of sensors so as to minimize energy consumption and maximize inference accuracy. A Bayesian approach allows us to deal with the imprecision of sensory measurements caused by environmental noise and possible hardware malfunctions. The effectiveness of our approach is demonstrated in the application scenario of user activity recognition in an Ambient Intelligence system managing a smart home environment. Experimental results show that the proposed solution outperforms static approaches to context-aware multi-sensor fusion, achieving substantial energy savings whilst maintaining a high degree of inference accuracy.
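As a rough illustration of the fusion and reconfiguration loop summarized above, the following sketch (all sensor names, models, and parameters are hypothetical placeholders, not the paper's actual system) shows a discrete-state recursive Bayesian filter that fuses readings from a sampled subset of sensors and then greedily selects the next subset by trading expected uncertainty reduction against energy cost.

    # Minimal sketch under assumed models: a discrete-state Bayesian filter fuses
    # readings from a sampled sensor subset, then a greedy step picks the next
    # subset by balancing expected entropy reduction against energy cost.
    import numpy as np

    ACTIVITIES = ["sleeping", "cooking", "watching_tv"]                    # hidden states
    SENSORS = {"bed_pressure": 0.2, "kitchen_pir": 0.5, "tv_power": 0.3}   # name -> energy cost

    # Transition model P(x_t | x_{t-1}) and per-sensor likelihoods P(z=1 | x);
    # all numbers are illustrative placeholders.
    TRANSITION = np.array([[0.8, 0.1, 0.1],
                           [0.1, 0.8, 0.1],
                           [0.1, 0.1, 0.8]])
    LIKELIHOOD = {
        "bed_pressure": np.array([0.90, 0.05, 0.10]),
        "kitchen_pir":  np.array([0.05, 0.90, 0.10]),
        "tv_power":     np.array([0.05, 0.10, 0.90]),
    }

    def predict(belief):
        """Propagate the belief one step through the transition model."""
        return TRANSITION.T @ belief

    def update(belief, readings):
        """Fuse binary readings from the sampled sensors (naive-Bayes observation model)."""
        for name, fired in readings.items():
            p = LIKELIHOOD[name]
            belief = belief * (p if fired else 1.0 - p)
        return belief / belief.sum()

    def entropy(p):
        p = p[p > 0]
        return -(p * np.log(p)).sum()

    def choose_sensors(belief, budget=0.6):
        """Greedily pick sensors whose expected entropy reduction justifies their cost."""
        chosen, spent = [], 0.0
        for name, cost in sorted(SENSORS.items(), key=lambda kv: kv[1]):
            p_fire = float(LIKELIHOOD[name] @ belief)
            expected_h = (p_fire * entropy(update(belief, {name: True})) +
                          (1.0 - p_fire) * entropy(update(belief, {name: False})))
            if spent + cost <= budget and expected_h < entropy(belief):
                chosen.append(name)
                spent += cost
        return chosen

    # One fusion/reconfiguration cycle with made-up readings.
    belief = np.full(len(ACTIVITIES), 1.0 / len(ACTIVITIES))
    belief = update(predict(belief), {"kitchen_pir": True, "tv_power": False})
    print(dict(zip(ACTIVITIES, belief.round(3))), "-> next subset:", choose_sensors(belief))

The greedy selection step is only a stand-in for the self-optimization tier described in the abstract; the paper's actual reconfiguration policy and network structure should be taken from the full text.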
