Abstract

In this paper, we investigate resource allocation in a D2D-aided Fog computing system with multiple mobile user equipments (MUEs). Each MUE requests a task from a task library and must decide how to perform it by selecting one of three processing modes: local mode, fog offloading mode, and cloud offloading mode. Two scenarios are considered: task caching and its optimization during off-peak time, and task offloading and its optimization in real time. In particular, task caching refers to caching the completed task application and its related data. In the first scenario, to maximize the average utility of MUEs, a task caching optimization problem is formulated with stochastic theory and solved by a GA-based task caching algorithm. In the second scenario, to maximize the total utility of the system, the task offloading and resource optimization problem is formulated as a mixed integer nonlinear programming (MINLP) problem that jointly considers the MUE allocation policy, the task offloading policy, and the computational resource allocation policy. Due to the nonconvexity of the problem, we decompose it into a multi-MUE association problem (MMAP) and a mixed Fog/Cloud task offloading optimization problem (MFCOOP). The former is solved by a Gini coefficient-based MUE allocation algorithm that selects the MUEs contributing most to the total utility. The task offloading optimization problem is proved to be a potential game and solved by a distributed algorithm with Lagrange multipliers. Finally, simulations show the effectiveness of the proposed scheme in comparison with other baseline schemes.
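The abstract mentions a Gini coefficient-based MUE allocation step. The paper's exact formulation is not given here, but the Gini coefficient itself is a standard inequality measure; the sketch below computes it from a list of per-MUE utility contributions using the mean-absolute-difference form. The function name and the interpretation of the inputs are illustrative assumptions, not the authors' implementation.

```python
def gini(values):
    """Gini coefficient of a list of non-negative values.

    Uses the sorted-rank identity:
        G = 2 * sum_i(i * x_(i)) / (n * sum_i x_i) - (n + 1) / n
    where x_(i) are the values sorted ascending (1-indexed rank i).
    Returns 0.0 for perfectly equal or degenerate inputs.
    """
    vals = sorted(values)
    n = len(vals)
    total = sum(vals)
    if n == 0 or total == 0:
        return 0.0
    weighted = sum((i + 1) * x for i, x in enumerate(vals))
    return 2.0 * weighted / (n * total) - (n + 1.0) / n
```

A selection rule of the kind the abstract describes could then, for example, prefer an MUE association whose utility contributions yield a lower Gini coefficient (more balanced) at comparable total utility.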

Highlights

  • In recent years, the world has witnessed a growing number of intelligent devices and the accompanying wireless data traffic [1], [2]

  • Inspired by the concept of task caching proposed in [35], we further investigate the benefits of task caching for the D2D-aided Fog computing networks

  • SYSTEM MODEL: we introduce a D2D-aided Fog computing system model with a hierarchical computing structure consisting of a set of mobile user equipments (MUEs) and Fog nodes (FNs)
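The highlights mention a GA-based task caching algorithm that decides which task applications an FN should cache under a storage budget. The paper's chromosome encoding and fitness function are not reproduced here, so the following is a minimal illustrative sketch: a 0/1 vector marks cached tasks, fitness is total cached utility with infeasible (over-capacity) solutions penalized, and evolution uses elitist selection, one-point crossover, and single-bit mutation. All names and parameters are assumptions.

```python
import random

def ga_task_caching(utilities, sizes, capacity,
                    pop_size=30, generations=100, seed=0):
    """Pick a subset of tasks to cache (as a 0/1 vector) that maximizes
    total utility subject to a storage capacity, via a simple GA sketch."""
    rng = random.Random(seed)
    n = len(utilities)

    def fitness(chrom):
        used = sum(s for s, g in zip(sizes, chrom) if g)
        if used > capacity:            # infeasible: heavy penalty
            return -1.0
        return float(sum(u for u, g in zip(utilities, chrom) if g))

    # Random initial population of 0/1 chromosomes.
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]   # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]      # one-point crossover
            i = rng.randrange(n)
            child[i] ^= 1                  # single-bit mutation
            children.append(child)
        pop = survivors + children

    best = max(pop, key=fitness)
    return best, fitness(best)
```

For instance, with three tasks of utilities [10, 7, 6], sizes [5, 4, 3], and capacity 7, the GA searches for a feasible subset with high total utility (caching the second and third tasks yields utility 13 within the budget).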



Introduction

The world has witnessed a growing number of intelligent devices and the accompanying wireless data traffic [1], [2]. It is foreseen that mobile data traffic will increase even more significantly due to the development of novel sophisticated applications, such as face recognition, interactive gaming, and augmented reality [3]. These emerging applications and services require extensive computing capability, substantial battery energy, and high data rates. To address these challenges, Fog computing has been proposed as a new paradigm that provides cloud services at the edge of the network [4]. By deploying numerous Fog nodes (FNs) in the edge network, MUEs can offload their tasks to one of the Fog servers or to the cloud server, which reduces backbone traffic and decreases latency for delay-sensitive services [7], [8].


