Vehicular fog computing (VFC) has emerged as a promising paradigm for provisioning resources to Internet of Vehicles (IoV) application requests. With the rapid expansion of services in vehicular fog networks, such as augmented reality, 3-D gaming, and autonomous driving, allocating fog computing resources to IoV applications under deadline and vehicle energy constraints is becoming increasingly challenging. In this paper, we investigate a task scheduling problem in vehicular fog networks that jointly optimizes the task success rate and the energy consumption of vehicles, taking into account the vehicles' moving paths and the social relationships among them. We model the social relationships among vehicles based on their communication patterns and social characteristics to improve the task success rate, and we capture vehicle mobility using a Markov renewal process (MRP). We formulate the problem as a mixed-integer nonlinear program (MINLP) and prove that it is NP-hard. To solve it, we convert the problem into a Markov decision process (MDP) and design a federated deep reinforcement learning (FDRL) mechanism. We compare the proposed algorithm with existing scheduling approaches; simulation results show that it schedules tasks efficiently, improving the success rate by 12% and reducing energy consumption by 36% compared with the existing algorithms.
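The abstract does not disclose the network architecture, reward design, or aggregation rule used in the FDRL mechanism, so the sketch below is only an illustrative outline of how per-vehicle reinforcement-learning agents solving a local scheduling MDP could be combined through a FedAvg-style averaging round. All names (`VehicleAgent`, `federated_average`, the state/action encodings, and the toy reward) are hypothetical and are not taken from the paper.

```python
import numpy as np

# Hypothetical sketch: each vehicle trains a tiny Q-learning agent on its local
# task-scheduling MDP (state = discretized deadline/energy/social-tie features,
# action = which fog node to offload to), and a coordinator periodically
# averages the local value tables, mimicking one FedAvg-style FDRL round.

N_STATES, N_ACTIONS = 16, 4          # coarse discretization, assumed
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.2    # learning rate, discount, exploration

class VehicleAgent:
    def __init__(self, rng):
        self.q = np.zeros((N_STATES, N_ACTIONS))
        self.rng = rng

    def act(self, s):
        # Epsilon-greedy action selection over offloading targets.
        if self.rng.random() < EPS:
            return int(self.rng.integers(N_ACTIONS))
        return int(np.argmax(self.q[s]))

    def learn(self, s, a, r, s_next):
        # Standard one-step Q-learning update.
        td_target = r + GAMMA * np.max(self.q[s_next])
        self.q[s, a] += ALPHA * (td_target - self.q[s, a])

def simulate_reward(s, a, rng):
    # Placeholder reward: trades off task success against an energy cost.
    success = rng.random() < 0.5 + 0.1 * a / N_ACTIONS
    energy_cost = 0.05 * a
    return (1.0 if success else 0.0) - energy_cost

def local_training_round(agent, steps, rng):
    s = int(rng.integers(N_STATES))
    for _ in range(steps):
        a = agent.act(s)
        r = simulate_reward(s, a, rng)
        s_next = int(rng.integers(N_STATES))  # toy transition; a mobility model (e.g., MRP) would go here
        agent.learn(s, a, r, s_next)
        s = s_next

def federated_average(agents):
    # FedAvg-style aggregation: average local tables, then broadcast back.
    global_q = np.mean([ag.q for ag in agents], axis=0)
    for ag in agents:
        ag.q = global_q.copy()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    agents = [VehicleAgent(rng) for _ in range(5)]
    for _ in range(10):                      # federated rounds
        for ag in agents:
            local_training_round(ag, steps=100, rng=rng)
        federated_average(agents)
    print("Mean Q-value after federated rounds:", float(agents[0].q.mean()))
```

In the paper's setting the local learner would be a deep RL agent and the averaged objects would be network weights rather than tables, but the round structure (local MDP training followed by parameter aggregation) is the part this sketch is meant to illustrate.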