Abstract

Fog computing (also known as edge computing) is a decentralized computing architecture that seeks to minimize service latency and average response time in IoT applications by providing compute and network services physically close to end users. A fog environment consists of a network of fog nodes, and IoT applications are composed of containerized microservices that communicate with each other. Because of the limited resources of individual fog nodes, it is often impossible to deploy all the containers of an application on a single node, so communicating containers must be distributed across multiple fog nodes. The distribution and management of containerized IoT applications is therefore critical to system performance in a fog environment. Kubernetes, an open-source system, has become the de facto standard for container orchestration by simplifying the deployment and management of containerized applications. Despite the progress made by academia and industry on container management, and the wide-scale adoption of Kubernetes in cloud environments, container management in fog environments is still at an early stage in terms of both research and practical deployment. This article aims to fill this gap by analyzing the suitability of the Kubernetes container orchestration tool for the fog computing model. The paper also highlights limitations of the current Kubernetes approach and provides ideas for further research to adapt it to the needs of the fog environment. Lastly, we present experiments that demonstrate the feasibility and industrial practicality of deploying and managing containerized IoT applications in a fog computing environment.
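As a concrete illustration of the setting described above, the following is a minimal sketch, not drawn from the paper itself, of how a single containerized microservice might be deployed to a labelled fog node using the standard Kubernetes Python client. The image name `example/sensor-processor:latest` and the node label `node-role.kubernetes.io/fog` are hypothetical placeholders; a real fog deployment would use its own images, labels, and placement policy.

```python
from kubernetes import client, config

# Load the local kubeconfig (assumes a reachable Kubernetes control plane).
config.load_kube_config()
apps = client.AppsV1Api()

# One containerized microservice with modest resource requests/limits,
# reflecting the constrained capacity of a fog node.
container = client.V1Container(
    name="sensor-processor",
    image="example/sensor-processor:latest",  # hypothetical image
    resources=client.V1ResourceRequirements(
        requests={"cpu": "100m", "memory": "128Mi"},
        limits={"cpu": "250m", "memory": "256Mi"},
    ),
)

# Pin the pod to nodes carrying a (hypothetical) fog-node label via nodeSelector.
template = client.V1PodTemplateSpec(
    metadata=client.V1ObjectMeta(labels={"app": "sensor-processor"}),
    spec=client.V1PodSpec(
        containers=[container],
        node_selector={"node-role.kubernetes.io/fog": "true"},
    ),
)

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="sensor-processor"),
    spec=client.V1DeploymentSpec(
        replicas=1,
        selector=client.V1LabelSelector(match_labels={"app": "sensor-processor"}),
        template=template,
    ),
)

# Create the Deployment; Kubernetes then handles scheduling and restarts.
apps.create_namespaced_deployment(namespace="default", body=deployment)
```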
