Abstract
Fog computing is an emerging technology that shifts data processing, analytics, storage, and networking closer to the devices that generate the data. This gives IoT systems an advantage, as it enables real-time data processing and decision-making. Sensors in IoT systems collect data whose value can diminish without timely computation, making real-time analysis and processing critical. A centralized, cloud-based approach to data analysis increases latency, which slows the necessary data-driven decision-making. In time-critical IoT systems, fog computing is therefore combined with machine learning to enable organizations and systems to make data-driven decisions quickly. Because sensors in an IoT environment collect colossal amounts of data, transferring all of it to the cloud for processing adds delay and consumes bandwidth, strengthening the case for fog computing. Much of the data that would otherwise be processed in the cloud can instead be processed on fog nodes, where machine learning algorithms take in the data and produce actionable results. The following sections examine different problems solved using machine learning in fog systems and the approaches taken, list the advantages and disadvantages of using machine learning in fog systems, and look at some real-world use cases of machine learning combined with fog systems.
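As a rough illustration of this division of labor, the sketch below shows a hypothetical fog-node loop in Python that runs a lightweight statistical check on sensor readings locally and forwards only anomalous summaries upstream instead of streaming every raw reading to the cloud. The function names (read_sensor, forward_to_cloud), the sliding-window size, and the z-score threshold are assumptions for illustration, not part of the original work.

```python
# Illustrative sketch only: a hypothetical fog-node loop that detects
# outliers locally and forwards only actionable summaries upstream.
# read_sensor and forward_to_cloud are placeholders, not a real API.

from collections import deque
import random
import statistics


def read_sensor() -> float:
    """Placeholder for a real sensor read (e.g., a temperature probe)."""
    return random.gauss(25.0, 1.0)


def forward_to_cloud(summary: dict) -> None:
    """Placeholder for an upstream call; prints instead of sending."""
    print("forwarding summary:", summary)


def fog_loop(window_size: int = 60, z_threshold: float = 3.0, iterations: int = 300) -> None:
    """Keep a sliding window of readings and flag statistical outliers locally."""
    window: deque = deque(maxlen=window_size)
    for _ in range(iterations):
        reading = read_sensor()
        if len(window) >= 2:
            mean = statistics.fmean(window)
            stdev = statistics.stdev(window) or 1e-9
            z = abs(reading - mean) / stdev
            if z > z_threshold:
                # Only anomalies leave the fog node, saving bandwidth and latency.
                forward_to_cloud({"reading": reading, "z_score": round(z, 2)})
        window.append(reading)


if __name__ == "__main__":
    fog_loop()
```

In practice the local check could be any lightweight model (a decision tree, a small neural network, or a simple threshold as here); the point of the sketch is only that inference happens on the fog node and the cloud receives condensed, decision-ready output.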