Abstract

The electric Internet of Things (EIoT), which integrates 5G and edge computing, can provide data transmission and processing guarantees for the smart grid. However, computation offloading optimization, i.e., the joint optimization of server selection and computation resource allocation, still faces several challenges: the difficulty of balancing the tradeoff among various quality of service (QoS) parameters, the coupling between server selection and computation resource allocation, and competition among multiple devices. To address these challenges, we propose an empirical matching-based computation offloading optimization algorithm for 5G and edge computing-integrated EIoT. The objective is to minimize computation offloading delay by jointly optimizing large-timescale server selection and small-timescale computation resource allocation. We first model the large-timescale server selection problem as a many-to-one matching problem, which is decoupled from the small-timescale computation resource allocation by establishing matching preference lists based on empirical performance. The large-timescale server selection problem is then solved by a pricing-based matching-with-quota algorithm. Based on the resulting suboptimal server selection, the small-timescale computation resource allocation problem is subsequently solved by Lagrange dual decomposition, and its result is fed back to update the large-timescale empirical performance. Finally, extensive simulations demonstrate that the proposed algorithm outperforms existing algorithms.
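
To make the matching step concrete, the sketch below shows a generic many-to-one deferred-acceptance matching with quotas, written in Python. It is an illustration under assumptions, not the paper's method: devices rank servers by empirical offloading delay, servers rank devices by an internal preference, and the paper's pricing mechanism is replaced here by plain quota-based rejection. All names (da_matching, dev_prefs, srv_prefs, quotas) are hypothetical.

    # Illustrative sketch only: generic many-to-one deferred acceptance
    # with quotas, standing in for the paper's pricing-based variant.
    def da_matching(dev_prefs, srv_prefs, quotas):
        # dev_prefs: device -> servers ranked best-first (e.g., by empirical delay)
        # srv_prefs: server -> devices ranked best-first (complete lists assumed)
        # quotas:    server -> maximum number of devices it can admit
        rank = {s: {d: i for i, d in enumerate(p)} for s, p in srv_prefs.items()}
        admitted = {s: set() for s in quotas}
        nxt = {d: 0 for d in dev_prefs}      # next server each device proposes to
        free = list(dev_prefs)               # devices still proposing
        while free:
            d = free.pop()
            if nxt[d] >= len(dev_prefs[d]):
                continue                     # preference list exhausted
            s = dev_prefs[d][nxt[d]]
            nxt[d] += 1
            admitted[s].add(d)
            if len(admitted[s]) > quotas[s]:
                # Over quota: keep the server's most-preferred devices and
                # reject the least-preferred, which proposes elsewhere next.
                worst = max(admitted[s], key=lambda x: rank[s][x])
                admitted[s].discard(worst)
                free.append(worst)
        return admitted

    # Example: three devices compete for two servers with quotas 2 and 1.
    devs = {"d1": ["s1", "s2"], "d2": ["s1", "s2"], "d3": ["s1", "s2"]}
    srvs = {"s1": ["d1", "d2", "d3"], "s2": ["d1", "d2", "d3"]}
    print(da_matching(devs, srvs, {"s1": 2, "s2": 1}))
    # -> {'s1': {'d1', 'd2'}, 's2': {'d3'}}

In the paper's setting, the preference lists would be rebuilt each large timescale from the empirical delays produced by the small-timescale resource allocation, so the matching adapts as performance estimates improve.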
