Abstract

Edge computing delivers network services with low latency and real-time processing by providing cloud services at the network edge. It offers advantages such as low latency, locality, and network traffic distribution, but the associated resource management has become a significant challenge because of its inherently hierarchical, distributed, and heterogeneous nature. Cloud-based network services such as crowd sensing, hierarchical deep learning systems, and cloud gaming each have their own traffic patterns and computing requirements. To provide a satisfactory user experience for these services, resource management must comprehensively consider service diversity, client usage patterns, and network performance indicators. This study proposes an algorithm that simultaneously considers computing resources and network traffic load when deploying servers that provide edge services. The proposed algorithm generates candidate deployments based on factors that affect traffic load, such as the number of servers, server locations, and client mapping, according to service characteristics and usage. A final deployment plan is then established using a partial vector bin packing (PVBP) scheme that considers both the generated traffic and the computing resources in the network. The proposed algorithm is evaluated in simulations that reflect actual network service and device characteristics.
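The selection step described above builds on vector bin packing, where each server's demand and each node's capacity are vectors spanning several resource dimensions (e.g., computing load and generated traffic). The sketch below illustrates the general technique with a first-fit-decreasing heuristic over two dimensions; it is not the paper's exact partial-VBP variant, and all server names, demands, and capacities are made-up illustrative values:

```python
# Illustrative first-fit-decreasing vector bin packing (FFD-VBP).
# Items are servers with (cpu, traffic) demand vectors; bins are edge
# nodes with a shared (cpu, traffic) capacity vector. All numbers are
# assumptions for the example, not values from the paper.

def ffd_vector_bin_pack(items, capacity):
    """Place demand vectors into bins of the given capacity vector.

    items: list of (name, (cpu, traffic)) demand tuples
    capacity: (cpu, traffic) capacity of every bin
    Returns a list of bins, each a list of item names.
    """
    # FFD order: sort by the largest normalized demand component.
    def key(item):
        _, demand = item
        return max(d / c for d, c in zip(demand, capacity))

    bins, loads = [], []
    for name, demand in sorted(items, key=key, reverse=True):
        for i, load in enumerate(loads):
            # First bin with room in every dimension takes the item.
            if all(l + d <= c for l, d, c in zip(load, demand, capacity)):
                bins[i].append(name)
                loads[i] = tuple(l + d for l, d in zip(load, demand))
                break
        else:
            bins.append([name])
            loads.append(tuple(demand))
    return bins

servers = [("s1", (4, 30)), ("s2", (2, 50)), ("s3", (3, 20)), ("s4", (1, 10))]
print(ffd_vector_bin_pack(servers, capacity=(8, 60)))
# → [['s2', 's4'], ['s1', 's3']]
```

The key point the paper exploits is that packing feasibility is checked per dimension, so a node can be saturated in traffic while still having spare CPU, and vice versa.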

Highlights

  • Edge computing is a paradigm proposed to address various issues encountered when network services operate based on cloud computing [1,2]

  • Candidate 0 is a case in which three servers, located at the nodes to which each client is connected, are mapped to the corresponding clients. No backhaul network traffic is generated by server-client communication; only synchronization traffic flows on the link between nodes 1 and 3 and the link between nodes 3 and 4

  • The simulation results are analyzed by a comparison with random deployment, the first fit decreasing (FFD) scheme, and the previously proposed method [25]


Summary

Introduction

Information can be delivered to the driver or occupants using augmented reality on a head-up display on the vehicle's windshield. In this scenario, gaze tracking of the driver or passenger, video processing, and object detection may be handled by the computing resources of adjacent edge devices [7,8]. This is possible because the edge computing model has significantly lower latency than the cloud model. Under cloud computing, cloud games were difficult to realize because of the round-trip time, but in an edge computing environment the latency problem is solved, making them a potentially feasible service [11]

Motivations
Contributions
Related Work
Main Algorithms
Basic Components
Phase 1
The Set A
Candidate Selection Using Partial VBP
PVBP Algorithm Example
Result
Simulation Results
Simulation Environment
Simulation Results Analysis and Discussion
Conclusions and Future Work

