Abstract

In the Internet of Vehicles (IoV), an edge server can process computing tasks offloaded by vehicle-mounted terminals in close proximity, based on the application services it has cached. However, if the edge server caches every possible application service, its economic benefit is reduced owing to the large number of traffic service types. Additionally, vehicle-mounted terminal information may be leaked during data transmission. We therefore construct an integrated edge-service and federated-learning framework for the IoV. A resource management approach based on deep reinforcement learning (DRL) is also proposed for the joint optimisation of task offloading decisions, communication and computing resource allocation, and edge service caching placement. Simulation results indicate that the DRL-based approach copes with the high complexity of the network system and the large policy-selection space, and achieves good convergence performance and joint optimisation results.
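To make the joint decision space concrete, the sketch below shows one plausible way the state and action of such a DRL agent could be encoded: the state combines per-vehicle task sizes, channel conditions and the current cache placement, while a single discrete action jointly selects offloading targets, discretised resource shares and a cached service. This is a minimal illustrative sketch, not the paper's implementation; all names, dimensions and parameter values (N_VEHICLES, N_SERVICES, CACHE_SLOTS, RES_LEVELS) are assumptions made for illustration only.

```python
# Minimal sketch (assumed, not the paper's implementation) of composing the state
# and joint action space for DRL-based offloading, resource allocation and caching.
import numpy as np

rng = np.random.default_rng(0)

N_VEHICLES = 4     # vehicles generating offloadable tasks (assumed)
N_SERVICES = 3     # application service types the edge could cache (assumed)
CACHE_SLOTS = 2    # services that fit in the edge cache (assumed)
RES_LEVELS = 4     # discretised computing-resource shares (assumed)

def make_state():
    """Observation: per-vehicle task size, channel gain, and the current cache bitmap."""
    task_size = rng.uniform(0.5, 5.0, N_VEHICLES)      # Mbits (illustrative)
    channel_gain = rng.uniform(0.1, 1.0, N_VEHICLES)   # normalised
    cache_bitmap = np.zeros(N_SERVICES)
    cache_bitmap[rng.choice(N_SERVICES, CACHE_SLOTS, replace=False)] = 1
    return np.concatenate([task_size, channel_gain, cache_bitmap])

# Joint discrete action: (local vs. edge offloading) x (resource level) per vehicle,
# combined with one cache-placement choice. This product is what makes the policy
# space large and motivates a DRL solution.
N_ACTIONS = (2 * RES_LEVELS) ** N_VEHICLES * N_SERVICES

def epsilon_greedy(q_values, epsilon=0.1):
    """Standard epsilon-greedy selection over the joint action index."""
    if rng.random() < epsilon:
        return int(rng.integers(len(q_values)))
    return int(np.argmax(q_values))

state = make_state()
q_head = rng.normal(size=N_ACTIONS)   # stand-in for a Q-network's output layer
action = epsilon_greedy(q_head)
print(f"state dim = {state.size}, joint actions = {N_ACTIONS}, chosen action = {action}")
```

Even with these small assumed dimensions, the joint action space already exceeds four thousand discrete choices, which illustrates why a value- or policy-based DRL method is preferred over exhaustive search.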
