Abstract
Internet of Things (IoT) devices (wireless sensors, actuators, and computing devices) produce a large volume and variety of data, and the data they produce are transient. To overcome the limitations of the traditional IoT architecture, in which data are sent to the cloud for processing, an emerging technology known as fog computing has recently been proposed. Fog computing brings storage, computing, and control close to the end devices; it complements the cloud and provides services to IoT devices. Hence, data used by IoT devices should be cached at fog nodes to reduce bandwidth utilization and latency. This chapter discusses the utility of data caching at fog nodes. Furthermore, machine learning techniques can reduce latency by predicting the future demands of IoT devices and caching the corresponding data near them. Therefore, this chapter also discusses several machine learning techniques that can be used to extract accurate data and predict the future requests of IoT devices. Caching data at fog nodes based on machine learning predictions of end-user demand reduces both the communication cost and the latency of data access. Hence, using machine learning techniques for request prediction at fog nodes makes the system more efficient.
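To make the idea summarized above concrete, the following is a minimal, hypothetical sketch (not taken from the chapter) of prediction-driven caching at a fog node. It assumes a fog node that tracks requests from nearby IoT devices and uses a simple exponentially weighted moving average as a stand-in for the demand-prediction model; the class and item names are illustrative only, and a real deployment would use richer features and models.

```python
from collections import defaultdict

class FogCachePredictor:
    """Illustrative fog-node cache that ranks items by predicted future demand."""

    def __init__(self, capacity=2, alpha=0.3):
        self.capacity = capacity          # number of items the fog node can cache
        self.alpha = alpha                # smoothing factor of the demand estimate
        self.demand = defaultdict(float)  # predicted future demand per item

    def observe(self, item):
        """Update the per-item demand estimate after seeing one request."""
        for key in set(self.demand) | {item}:
            hit = 1.0 if key == item else 0.0
            self.demand[key] = self.alpha * hit + (1 - self.alpha) * self.demand[key]

    def items_to_cache(self):
        """Return the items with the highest predicted demand, up to capacity."""
        ranked = sorted(self.demand, key=self.demand.get, reverse=True)
        return ranked[: self.capacity]

# Example: a stream of requests arriving at a fog node from nearby IoT devices
predictor = FogCachePredictor(capacity=2)
for request in ["temp_sensor_7", "cam_3", "temp_sensor_7",
                "temp_sensor_7", "cam_3", "gps_1"]:
    predictor.observe(request)

# The fog node would proactively cache these items to serve future requests
# locally, saving backhaul bandwidth and round trips to the cloud.
print(predictor.items_to_cache())
```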