Abstract
Many pervasive computing applications rely on sensor data streams, whose accuracy can vary significantly. Depending on the application, deriving information (e.g., higher-level context) from low-quality sensor data can lead to incorrect decisions or even critical situations. It is therefore important to control quality throughout the entire data stream processing pipeline, from the raw sensor data up to the derived information, e.g., a complex event. In this paper, we present a uniform metadata model that represents sensor data and information quality at all levels of processing; we show how this metadata model can be integrated into a data stream processing engine to ease the development of quality-aware applications; and we present an approach to learning probability distributions of incoming sensor data that requires no prior knowledge. We demonstrate and evaluate our approach in a real-world scenario.
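To make the two core ideas concrete, the sketch below illustrates (a) a stream tuple annotated with quality metadata and (b) an online estimator that learns a distribution of incoming readings without prior knowledge. This is a minimal illustration, not the paper's actual metadata model or learning algorithm: the class names (`QualityAnnotatedValue`, `OnlineGaussianEstimator`), the specific quality attributes, and the choice of Welford's online mean/variance algorithm are all assumptions made here for the example.

```python
from dataclasses import dataclass
import math

@dataclass
class QualityAnnotatedValue:
    """A sensor reading paired with quality metadata.

    Hypothetical structure; the paper's metadata model may track
    different or additional quality dimensions.
    """
    value: float
    accuracy: float    # e.g., estimated standard deviation of sensor error
    confidence: float  # probability that the reading is valid, in [0, 1]
    timestamp: float


class OnlineGaussianEstimator:
    """Learns mean and variance of a data stream incrementally
    (Welford's online algorithm), with no prior knowledge of the
    underlying distribution."""

    def __init__(self) -> None:
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations from the mean

    def update(self, x: float) -> None:
        """Incorporate one new reading in O(1) time and memory."""
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    @property
    def variance(self) -> float:
        return self.m2 / (self.n - 1) if self.n > 1 else float("inf")

    def stddev(self) -> float:
        return math.sqrt(self.variance)


# Usage: learn the stream's distribution, then annotate each reading
# with a quality estimate derived from it.
estimator = OnlineGaussianEstimator()
for t, raw in enumerate([20.1, 20.3, 19.9, 20.0, 20.2]):
    estimator.update(raw)
    annotated = QualityAnnotatedValue(
        value=raw,
        accuracy=estimator.stddev() if estimator.n > 1 else float("inf"),
        confidence=1.0,  # placeholder; a real system would compute this
        timestamp=float(t),
    )
    print(annotated)
```

One design point worth noting: because the estimator is incremental, it fits the streaming setting described in the abstract, where readings arrive continuously and no training set is available in advance.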