Abstract

The integration of the Internet of Things (IoT) with the cloud environment has led to the Cloud of Things, which has given rise to new challenges in the IoT area. In this paper, a Markov model learning method is used to estimate the probability that each object will need resources in the near future, enabling resource allocation in the fog layer that reduces latency and maximizes network utilization. Using simulations on the CloudSim platform, processor utilization, workflow overhead, physical-machine energy consumption, data locality, resource utilization, and task completion are each evaluated against the number of tasks and compared with the SMDP (Semi-Markov Decision Process) and MDP methods; the results show that the proposed approach is effective and promising.
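The abstract does not detail the prediction step, so the following is a minimal sketch, assuming a first-order Markov chain over two hypothetical demand states ("idle"/"busy") learned from an object's observed request history; the class name, states, and smoothing are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch (not from the paper): a first-order Markov chain
# estimated from an object's demand history, used to approximate the
# probability that the object will need resources in the near future.
from collections import defaultdict

class MarkovDemandModel:
    def __init__(self, states=("idle", "busy")):
        self.states = states
        # Transition counts, initialized to 1 (Laplace smoothing).
        self.counts = {s: defaultdict(lambda: 1) for s in states}

    def observe(self, prev_state, next_state):
        """Update transition counts from one observed step of history."""
        self.counts[prev_state][next_state] += 1

    def need_probability(self, current_state, need_state="busy"):
        """Estimated probability the object needs resources next step."""
        row = self.counts[current_state]
        total = sum(row[s] for s in self.states)
        return row[need_state] / total

# Example: score an object so the fog layer can prioritize allocation.
model = MarkovDemandModel()
history = ["idle", "idle", "busy", "busy", "idle", "busy"]
for prev, nxt in zip(history, history[1:]):
    model.observe(prev, nxt)
print(model.need_probability("busy"))  # estimated P(busy -> busy)
```

Under this assumption, the fog layer would rank objects by their predicted need probability and allocate resources to the highest-ranked ones first.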
