Abstract

With the growth of vehicular broadband services, network slicing is regarded as a promising technology for meeting their diverse requirements. In the network-slicing-enabled Internet of Vehicles (IoV), both resource allocation and task scheduling algorithms have received extensive attention. However, most existing works on the joint optimization of inter-slice resource allocation and intra-slice task scheduling ignore power consumption. This paper therefore further considers both the queuing latency and the power consumption of the uplink transmission. Three sub-Markov decision processes (MDPs) are modeled, and a layered deep reinforcement learning (DRL) based algorithm is designed for the joint optimization. The simulation consists of two parts: first, the cumulative rewards of different task scheduling algorithms are compared and analyzed; then, the performance of all solutions is evaluated under different task densities. The results show that the proposed algorithm significantly reduces queuing latency while keeping power consumption comparable to that of the second-best algorithm.
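To make the layered structure concrete, the sketch below illustrates one plausible reading of "three sub-MDPs with a layered learner": separate agents for inter-slice resource allocation, intra-slice task scheduling, and uplink power control, all trained against a shared reward that penalizes queuing delay plus power cost. This is not the paper's algorithm: the environment model, state and action spaces, and reward weights are invented for illustration, and tabular Q-learning stands in for the deep networks.

```python
import random

class SubAgent:
    """Epsilon-greedy tabular Q-learning agent for one sub-MDP
    (a simplified stand-in for one layer of the DRL hierarchy)."""

    def __init__(self, n_states, n_actions, alpha=0.1, gamma=0.9, eps=0.1):
        self.q = [[0.0] * n_actions for _ in range(n_states)]
        self.alpha, self.gamma, self.eps = alpha, gamma, eps
        self.n_actions = n_actions

    def act(self, s):
        if random.random() < self.eps:
            return random.randrange(self.n_actions)
        row = self.q[s]
        return row.index(max(row))

    def learn(self, s, a, r, s2):
        target = r + self.gamma * max(self.q[s2])
        self.q[s][a] += self.alpha * (target - self.q[s][a])

def env_step(queue, alloc, sched, power):
    """Toy uplink model (assumed): more allocated bandwidth, scheduling
    effort, and transmit power drain the queue faster; the reward
    penalizes the resulting queue length (latency) plus power cost."""
    served = min(queue, alloc + sched + power)
    arrivals = random.randint(0, 2)           # random task arrivals
    next_queue = min(4, queue - served + arrivals)
    reward = -next_queue - 0.5 * power        # latency + power penalty
    return next_queue, reward

random.seed(0)
# Layer order mirrors the abstract's decomposition.
alloc_agent = SubAgent(n_states=5, n_actions=2)   # inter-slice allocation
sched_agent = SubAgent(n_states=5, n_actions=2)   # intra-slice scheduling
power_agent = SubAgent(n_states=5, n_actions=2)   # uplink power control

queue, total = 4, 0.0
for _ in range(2000):
    a1 = alloc_agent.act(queue)
    a2 = sched_agent.act(queue)
    a3 = power_agent.act(queue)
    nq, r = env_step(queue, a1, a2, a3)
    # All three layers learn from the shared reward signal.
    for agent, a in ((alloc_agent, a1), (sched_agent, a2), (power_agent, a3)):
        agent.learn(queue, a, r, nq)
    queue, total = nq, total + r
```

The key design point this sketch captures is that the layers are coupled only through the shared state (queue backlog) and reward, so each sub-problem can be trained with its own, smaller decision model rather than one monolithic policy.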
