Abstract

Context caching plays an increasingly important role in delivering near real-time responses for context-aware distributed Internet of Things (IoT) applications, services and systems. A context management platform (CMP), a middleware that aggregates and redirects contextual information to support smart IoT applications, requires adaptive context caching to process and manage the enormous amounts of context stemming from IoT. In this work, we propose a novel approach to estimating the demand probability of context information, which helps improve the context retrieval performance of a CMP under near real-time constraints. The proposed approach applies machine learning algorithms to context query logs to estimate the context caching probability. We further use an evolutionary technique to optimise the context caching probability and thereby improve the context retrieval performance of the CMP. We conduct an experimental evaluation using a research prototype CMP, Context-as-a-Service (CoaaS), and show that the proposed technique can significantly improve context retrieval performance. Analysis of the experimental results shows that, with the context caching probability optimised by the evolutionary technique, the response time of CoaaS decreases by 43.68% on average.
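
The abstract only outlines the approach, so the following is a minimal illustrative sketch, not the paper's implementation: the per-item features, the choice of logistic regression as the demand-probability estimator, the toy fitness function, and the simple evolutionary loop for tuning a caching-probability threshold are all assumptions made for illustration.

```python
# Illustrative sketch only: features, model, fitness and the evolutionary loop
# are hypothetical stand-ins for the approach described in the abstract.
import random
import numpy as np
from sklearn.linear_model import LogisticRegression

# --- Step 1: estimate demand probability from context query logs ------------
# Hypothetical per-item features derived from past queries:
# [recent access rate, recency of last access]; label 1 = item re-queried soon.
rng = np.random.default_rng(0)
X = rng.random((500, 2))
y = (X[:, 0] > 0.5).astype(int)

model = LogisticRegression().fit(X, y)
demand_prob = model.predict_proba(X)[:, 1]   # estimated demand probability per item

# --- Step 2: evolve a caching-probability threshold --------------------------
# Toy proxy for response time: cache hits are rewarded, needless caching and
# misses are penalised. A real CMP would measure actual retrieval latency.
def fitness(threshold: float) -> float:
    cached = demand_prob >= threshold
    hits = np.sum(cached & (y == 1))
    wasted = np.sum(cached & (y == 0))
    misses = np.sum(~cached & (y == 1))
    return float(hits - 0.5 * wasted - 1.0 * misses)

population = [random.random() for _ in range(20)]
for _ in range(50):                           # simple (mu + lambda)-style loop
    population.sort(key=fitness, reverse=True)
    parents = population[:10]
    children = [min(1.0, max(0.0, p + random.gauss(0, 0.05))) for p in parents]
    population = parents + children

best = max(population, key=fitness)
print(f"Evolved caching-probability threshold: {best:.3f}")
```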
