Abstract
With the advent of the Internet of Vehicles (IoV), drivers are now provided with diverse time-sensitive vehicular services that typically require large-scale computation. Because civilian vehicles generally lack sufficient computational resources, their service requests are offloaded to cloud data centers and edge computing devices (ECDs), which have ample computational resources, to enhance the quality of service (QoS). However, ECDs are often overloaded with excessive service requests. In addition, because network conditions and service compositions are complicated and dynamic, centralized control of ECDs is hard to achieve. To tackle these challenges, this paper proposes DOM, a dynamic task-offloading method based on the minority game (MG) in cloud-edge computing. The MG is an effective tool with a distributed mechanism that minimizes dependence on centralized control in resource allocation. Within the MG, reinforcement learning (RL) is applied to optimize the distributed decision-making of participants. Finally, the effectiveness and adaptability of DOM are evaluated on a real-world dataset of IoV services.
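The abstract does not give implementation details, but the core idea of a minority game with RL-driven agents can be illustrated with a minimal sketch. All parameters below (agent count, learning rate, the binary "edge vs. cloud" action set) are illustrative assumptions, not the paper's actual DOM design: an odd number of agents repeatedly choose between two offloading targets, the side chosen by fewer agents "wins" (it is less congested), and each agent updates a simple per-action value estimate from its own reward, with no central coordinator.

```python
import random

random.seed(0)

N = 101        # odd number of agents, so a strict minority always exists
ROUNDS = 2000  # simulated offloading rounds
EPSILON = 0.1  # exploration rate for the epsilon-greedy rule
ALPHA = 0.1    # learning rate for the value update

# Each agent keeps one value estimate per action:
# action 0 = offload to the edge (ECD), action 1 = offload to the cloud.
q = [[0.0, 0.0] for _ in range(N)]

def choose(qa):
    """Epsilon-greedy choice over an agent's two action values."""
    if random.random() < EPSILON:
        return random.randrange(2)
    return 0 if qa[0] >= qa[1] else 1

minority_sizes = []
for _ in range(ROUNDS):
    actions = [choose(q[i]) for i in range(N)]
    ones = sum(actions)
    # The minority side wins: fewer requests means less contention there.
    minority = 1 if ones < N - ones else 0
    minority_sizes.append(min(ones, N - ones))
    for i in range(N):
        a = actions[i]
        reward = 1.0 if a == minority else 0.0
        # Independent, fully distributed value update: each agent learns
        # only from its own payoff, with no centralized control.
        q[i][a] += ALPHA * (reward - q[i][a])

# The closer the average minority size is to N // 2, the more evenly
# the agents have learned to spread load across the two targets.
print(sum(minority_sizes[-500:]) / 500)
```

In the classic minority game the best achievable outcome is a near-even split, so the average minority size approaching `N // 2` (here, 50) indicates that the distributed learners are balancing load without any coordinator; the actual DOM method presumably uses a richer state, reward, and action model than this two-action toy.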