Abstract

Latency-aware computation is essential for prominent technologies such as the Internet of Things (IoT). The things connected in IoT environments generate large volumes of real-time data, which are generally processed in the cloud. However, processing all of this data in the cloud is not an efficient solution for time-sensitive IoT applications. To address this issue, a new paradigm, fog computing, was proposed. Fog computing extends cloud computing and its services to the network edge, bringing most of the capabilities of the cloud much closer to where the data source is located. Time series analysis provides deeper insight into future behaviour, and the latest trend is to perform it with deep learning algorithms, which have proved more effective than traditional statistical approaches. This paper aims to replicate a present-day Industrial Internet of Things (IIoT) scenario by simulating a secure fog computing environment in which the edge data centers are authenticated by a Certificate Authority, perform time series analysis over a pre-collected dataset using Long Short-Term Memory (LSTM) neural networks to forecast the required result, and optimize the network intelligently through an efficacious dynamic task offloading mechanism in a secure way.
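
As a rough illustration of the forecasting component described above, the sketch below trains a small Long Short-Term Memory (LSTM) network to predict the next value of a sensor series from a sliding window of past readings. It is a minimal sketch only: the synthetic sine-wave series, the choice of PyTorch, the window size, and all hyperparameters are assumptions made here for illustration, since the abstract does not specify the paper's dataset or model configuration.

```python
# Illustrative sketch only: the paper's actual dataset, architecture, and
# hyperparameters are not given in the abstract, so a synthetic series and a
# small PyTorch LSTM stand in for them here.
import torch
import torch.nn as nn

torch.manual_seed(0)

# --- Synthetic stand-in for a pre-collected IIoT sensor series ---
t = torch.linspace(0, 100, 1000)
series = torch.sin(0.2 * t) + 0.1 * torch.randn(t.shape)

WINDOW = 20  # number of past readings used to predict the next one


def make_windows(series, window):
    """Slice a 1-D series into (input window, next value) training pairs."""
    xs, ys = [], []
    for i in range(len(series) - window):
        xs.append(series[i:i + window])
        ys.append(series[i + window])
    return torch.stack(xs).unsqueeze(-1), torch.stack(ys).unsqueeze(-1)


X, y = make_windows(series, WINDOW)  # X: (N, WINDOW, 1), y: (N, 1)


class LSTMForecaster(nn.Module):
    """Single-layer LSTM that maps a window of readings to the next value."""

    def __init__(self, hidden_size=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):
        out, _ = self.lstm(x)          # out: (batch, WINDOW, hidden_size)
        return self.head(out[:, -1])   # forecast from the last time step


model = LSTMForecaster()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(50):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()

# One-step-ahead forecast from the most recent window of readings
with torch.no_grad():
    latest_window = series[-WINDOW:].reshape(1, WINDOW, 1)
    print("next-value forecast:", model(latest_window).item())
```

In the scenario the abstract describes, a model of this kind would presumably run on the authenticated edge data centers over the pre-collected dataset, with the surrounding fog environment handling authentication and task offloading.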
