In the past decade, network data communication has experienced rapid growth, which has led to explosive congestion in heterogeneous networks. Moreover, emerging industrial applications, such as autonomous driving, put forward higher requirements on both networks and devices. However, running computation-intensive industrial applications locally is constrained by the limited resources of devices. Correspondingly, fog computing has recently emerged to alleviate the congestion of content-centric networks, and it has proven effective in industrial and transportation scenarios for reducing network delay and processing time. In addition, device-to-device offloading is viewed as a promising paradigm for transmitting network data in mobile environments, especially for autonomous vehicles. In this paper, jointly considering both the network traffic and the computation workload of industrial applications, we explore a fundamental tradeoff between energy consumption and service delay when provisioning mobile services in vehicular networks. In particular, when the available resources of mobile vehicles become a bottleneck, we propose a novel model to capture users' willingness to contribute their resources to the public. We then formulate a cost minimization problem within the framework of a Markov decision process (MDP) and propose a dynamic reinforcement learning scheduling algorithm and a deep dynamic scheduling algorithm to solve the offloading decision problem. Using different mobility trajectory traces, we conduct extensive simulations to evaluate the performance of the proposed algorithms. The results show that our proposed algorithms outperform other benchmark schemes in mobile edge networks.
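To make the MDP-based offloading formulation concrete, the sketch below shows a generic tabular Q-learning loop for a cost-minimization offloading decision. It is an illustration only, not the paper's actual algorithm: the state discretization (queue length, channel quality), the offloading targets, the cost weights, and the toy transition model are all hypothetical placeholders, since the abstract does not specify the state space, reward, or scheduling details.

```python
# Minimal, illustrative sketch: learning an offloading policy for a
# cost-minimization MDP with tabular Q-learning. All quantities below
# (states, actions, cost numbers, transitions) are assumptions, not the
# paper's model.
import random
from collections import defaultdict

ACTIONS = ("local", "fog", "vehicle")  # hypothetical offloading targets


def cost(action, queue_len, channel_q, w_energy=0.5, w_delay=0.5):
    """Weighted energy + delay cost (toy numbers, assumption only)."""
    if action == "local":
        energy, delay = 3.0, 1.0 + 0.5 * queue_len   # busy CPU -> longer delay
    elif action == "fog":
        energy, delay = 1.0, 2.0 - 0.3 * channel_q   # good channel -> faster
    else:  # offload to a neighboring vehicle (D2D)
        energy, delay = 1.5, 1.5
    return w_energy * energy + w_delay * delay


def step(state):
    """Toy environment transition: queue and channel evolve randomly."""
    return (random.randint(0, 4), random.randint(0, 3))


Q = defaultdict(float)                 # Q[(state, action)] -> expected cost
alpha, gamma, eps = 0.1, 0.9, 0.1      # learning rate, discount, exploration
state = (0, 0)

for _ in range(50_000):
    # Epsilon-greedy action selection; we MINIMIZE cost, so take argmin.
    if random.random() < eps:
        action = random.choice(ACTIONS)
    else:
        action = min(ACTIONS, key=lambda a: Q[(state, a)])

    c = cost(action, *state)
    nxt = step(state)
    best_next = min(Q[(nxt, a)] for a in ACTIONS)

    # Standard Q-learning update, written for a cost (not reward) MDP.
    Q[(state, action)] += alpha * (c + gamma * best_next - Q[(state, action)])
    state = nxt

# Inspect the learned offloading policy per (queue_len, channel_q) state.
policy = {(q, ch): min(ACTIONS, key=lambda a: Q[((q, ch), a)])
          for q in range(5) for ch in range(4)}
print(policy)
```

A deep variant would replace the Q-table with a neural network over a continuous state vector, which is presumably the direction of the paper's deep dynamic scheduling algorithm; the update rule and epsilon-greedy decision structure stay the same.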