Abstract
Recent developments in the Internet of Things (IoT) and real-time applications have led to unprecedented growth in connected devices and the data they generate. Traditionally, this sensor data is transferred to and processed in the cloud, and control signals are sent back to the relevant actuators as part of IoT applications. This cloud-centric IoT model results in increased latency, higher network load, and compromised privacy. To address these problems, Cisco coined the term Fog Computing in 2012, a decade ago; fog computing utilizes proximal computational resources for processing sensor data. Since its proposal, fog computing has attracted significant attention, and the research community has focused on addressing challenges such as fog frameworks, simulators, resource management, placement strategies, quality-of-service aspects, fog economics, and so forth. However, after a decade of research, we still do not see large-scale deployments of public or private fog networks that can be utilized to realize interesting IoT applications. The literature offers only pilot case studies, small-scale testbeds, and simulator-based demonstrations of the scalability of models addressing the respective technical challenges. There are several reasons for this; most importantly, fog computing has not yet presented a clear business case for companies and participating individuals. This article summarizes the technical, non-functional, and economic challenges that have hindered the adoption of fog computing, consolidating them into different clusters. It also summarizes the relevant academic and industrial contributions addressing these challenges and provides future research directions for realizing real-time fog computing applications, considering emerging trends such as federated learning and quantum computing.