Vehicular Fog Computing (VFC) is a fog computing paradigm that offloads delay-sensitive tasks to mobile fog vehicles rather than a remote cloud, meeting the computational demands of smart villages near rural highways. Task offloading in VFC raises several challenges that need to be addressed. Road Side Units (RSUs) deployed along rural highways are typically energy constrained, yet they must provide energy-efficient scheduling when allocating tasks to fog vehicles. Optimizing energy consumption is difficult because scheduling a task for local processing incurs computation cost, while allocating it to a fog vehicle incurs communication cost. Although offloading tasks to VFC reduces response latency, it increases RSU energy consumption through the transmission of task data to fog vehicles. This paper therefore formulates an energy-efficient vehicle scheduling problem for offloading tasks to mobile fog nodes, subject to task deadline and resource availability constraints. To resolve the high-dimensionality issue caused by the growing number of vehicles in RSU coverage, we propose an on-policy reinforcement learning scheduling algorithm combined with a fuzzy-logic-based greedy heuristic, named Fuzzy Reinforcement Learning (FRL). The greedy heuristic not only accelerates the learning process but also improves the long-term reward compared to the Q-learning algorithm. Extensive simulation experiments show that the proposed FRL algorithm outperforms other scheduling algorithms, including First Come First Serve (FCFS), Rate Monotonic Scheduling (RMS), Fuzzy, and Distributed Task Allocation with Distributed Process (DTA_DP).
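The abstract does not spell out how the fuzzy heuristic steers the learner, so the following minimal Python sketch shows one plausible reading: in an epsilon-greedy policy, the exploration step is replaced by a fuzzy ranking of candidate fog vehicles. The membership inputs (`dwell_time`, `cpu_free`), their breakpoints, and the tabular Q-learning-style update are all illustrative assumptions, not the paper's actual design.

```python
import random

def fuzzy_score(dwell_time, cpu_free):
    """Toy fuzzy suitability of a fog vehicle: ramp-style memberships for
    'long dwell time in RSU coverage' and 'high free CPU', combined with a
    min (fuzzy AND) rule. Inputs and breakpoints are assumed for illustration."""
    long_dwell = min(max((dwell_time - 5.0) / 25.0, 0.0), 1.0)
    high_cpu = min(max((cpu_free - 0.2) / 0.6, 0.0), 1.0)
    return min(long_dwell, high_cpu)

def select_action(q_row, vehicles, epsilon, rng):
    """Epsilon-greedy action selection where *exploration* follows the
    fuzzy ranking instead of a uniform random draw (the greedy heuristic)."""
    if rng.random() < epsilon:
        # heuristic step: pick the fuzzily most suitable vehicle
        return max(range(len(vehicles)), key=lambda i: fuzzy_score(*vehicles[i]))
    # exploitation: pick the action with the highest learned Q-value
    return max(range(len(q_row)), key=lambda i: q_row[i])

def q_update(q, s, a, reward, s_next, alpha=0.1, gamma=0.9):
    """Standard one-step tabular Q-learning update."""
    q[s][a] += alpha * (reward + gamma * max(q[s_next]) - q[s][a])

if __name__ == "__main__":
    rng = random.Random(0)
    vehicles = [(30.0, 0.9), (6.0, 0.3)]   # (dwell_time s, cpu_free fraction)
    q = [[0.0, 0.0], [0.0, 0.0]]           # two toy states, two actions
    a = select_action(q[0], vehicles, epsilon=1.0, rng=rng)
    q_update(q, 0, a, reward=1.0, s_next=1)
    print(a, q[0][a])                      # heuristic picks vehicle 0
```

Biasing exploration with the fuzzy ranking means early episodes already favor vehicles that are likely to finish a task before leaving coverage, which is one way such a heuristic can speed up learning relative to plain epsilon-greedy Q-learning.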