Abstract

The cloud-fog-edge hybrid system is the evolution of the traditional centralized cloud computing model. By combining resources at different levels, it can handle service requests from terminal users with lower latency. However, it also brings greater uncertainty, unreliability, and instability due to the decentralization and regionalization of service processing, as well as unreasonableness and unfairness in resource allocation, task scheduling, and coordination caused by the autonomy of node distribution. Therefore, this paper introduces blockchain technology to construct a trust-enabled interaction framework in a cloud-fog-edge environment; through a double-chain structure, it improves the reliability and verifiability of task processing without incurring large management overhead. Furthermore, to fully account for reasonableness and load balance in service coordination and task scheduling, Berger's model and the concept of service justice are introduced to perform reasonable matching of tasks and resources. We have developed a trust-based cloud-fog-edge service simulation system based on iFogSim, and through a large number of experiments, the performance of the proposed model is verified against several classical scheduling models in terms of makespan, scheduling success rate, latency, and user satisfaction.

Highlights

  • Based on this idea, this paper designs a cloud-fog-edge hybrid computing architecture in which the fog layer serves as a management middleware between the edge and the cloud, helping the scheduler decide where to deploy and execute a service so as to achieve better resource balancing and lower service latency [1]

  • Considering mobile application scenarios, we divide scheduling into two levels. The first level is user scheduling, which handles the matching of mobile terminals and access points; the second is task scheduling, which matches tasks with resources in the local pool

  • The main contributions of this paper are (1) it proposes a novel distributed, decentralized trust management model based on blockchain technology; (2) it introduces Berger's fairness theory to design a preference-based fair task scheduling model for cloud-fog-edge environments; and (3) it designs a new task scheduling algorithm that comprehensively considers user mobility, load balancing, and trust


Summary

Related Work

J. Wang et al. proposed a recommendation trust evaluation method based on the cloud model and attribute-weighted clustering [39]. S. Jian et al. proposed a trust-based multiobjective task allocation model for cloud service transactions [43]. The existing solutions cannot achieve full functionality in mobile fog computing systems due to the following limitations: (1) a centralized trust framework cannot be accurately integrated with cloud-fog-edge hybrid systems characterized by dynamic, distributed node autonomy and a loosely coupled topology; (2) the trust crisis of the central node leads to a single point of failure; (3) huge trust management overhead prevents use in instant trading scenarios; and (4) trust evidence lacks transparency. Therefore, the decentralized trust management model and trust-enabled transaction mechanism for cloud-fog-edge hybrid environments require further exploration.

System Overview
Trust-Based Location-Aware Fair Scheduling Model
Performance Evaluation
Findings
Conclusion and Future