Abstract
By creating an environment rich in computational and communication capabilities, ubiquitous computing is gradually being integrated into human activities. Inspired by adaptive ubiquitous learning, various intelligent devices (e.g., roadside units and infrared sensors) deployed in the Internet of Vehicles (IoV) are expected to be critical to mitigating urban traffic congestion and enhancing travel safety. In addition, benefiting from high mobility and real-time responsiveness, Unmanned Aerial Vehicles (UAVs) hold substantial promise for assisting the IoV in efficiently and flexibly handling latency-sensitive, computation-intensive tasks. Nevertheless, due to time-varying demands and heterogeneous computing resources, it is challenging to provide effective services for mobile devices while guaranteeing high-quality data transmission. Therefore, a distributed service offloading system framework for UAV-enhanced IoV is designed. To minimize service latency, a game theory-based distributed service offloading algorithm, named G-DSO, is proposed to realize adaptive ubiquitous learning for service request distribution. Finally, extensive experiments are conducted on real-world service requirement datasets. Experimental results demonstrate that the proposed G-DSO approach improves the hit rate by 2.68% to 74.42% compared with four existing service offloading methods, verifying the effectiveness and good scalability of G-DSO.
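The abstract does not detail how G-DSO itself operates. As an illustrative point of reference only, the sketch below shows a generic best-response loop for a distributed offloading (congestion) game: each vehicle repeatedly picks the server that minimizes its own latency given the others' choices, until no vehicle wants to deviate. The server names, the linear latency model, and all parameters are assumptions for illustration, not the paper's algorithm.

```python
"""Hypothetical sketch of a distributed, game-theoretic offloading loop.

Assumptions (not from the abstract): each vehicle chooses one server
(UAV or roadside unit) from a shared pool; latency grows linearly with
the number of vehicles sharing a server; vehicles update their choices
via best response until a pure Nash equilibrium is reached.
"""
import random

NUM_VEHICLES = 12
SERVERS = {"uav_0": 1.0, "uav_1": 1.2, "rsu_0": 2.0}  # assumed base latency per server


def latency(server, load):
    # Assumed model: base latency scaled by the number of co-located requests.
    return SERVERS[server] * load


def best_response(choices, i):
    # Vehicle i picks the server minimizing its own latency, given others' choices.
    loads = {s: sum(1 for j, c in enumerate(choices) if c == s and j != i)
             for s in SERVERS}
    return min(SERVERS, key=lambda s: latency(s, loads[s] + 1))


def run_game(max_rounds=50):
    choices = [random.choice(list(SERVERS)) for _ in range(NUM_VEHICLES)]
    for _ in range(max_rounds):
        changed = False
        for i in range(NUM_VEHICLES):
            new = best_response(choices, i)
            if new != choices[i]:
                choices[i], changed = new, True
        if not changed:  # no vehicle wants to deviate: equilibrium reached
            break
    return choices


if __name__ == "__main__":
    print(run_game())
```

Because each unilateral switch strictly lowers the switching vehicle's latency in this congestion-game model, the loop converges to a pure Nash equilibrium; G-DSO's actual formulation, latency model, and convergence argument are given in the paper itself.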