Low latency is an essential requirement for many emerging Internet of Things (IoT) applications. Cloud-based solutions can impose excessive network and processing latency that may hinder the system's operation. On the other hand, offloading tasks to the edge layer, close to where data is consumed, may be infeasible in many situations due to the limited computational resources of edge devices or because the system relies on third-party applications. One potential solution is caching data at the edge of the network, thereby decreasing the latency of subsequent requests, saving bandwidth, and avoiding redundant computation. At the same time, caching IoT data at the edge poses challenges that deviate from standard network caching, such as content dynamicity and power constraints. This paper presents a comprehensive review of the state of the art in IoT Edge Caching and proposes a novel taxonomy focused on five orthogonal features of edge caching: placement, distance, strategy, metrics, and design. Moreover, we illustrate the identified characteristics in five prevalent IoT use cases. Finally, we conduct an in-depth analysis and performance evaluation of edge caching deployed in a real Structural Health Monitoring (SHM) scenario. Results show that different caching strategies decrease system latency by more than 95 percent.