In the Internet of Vehicles (IoV), an edge server can process computing tasks offloaded by nearby vehicle-mounted terminals, provided the required application services are cached on the server. However, caching every possible application service reduces the economic benefit of the server because of the large number of traffic service types. In addition, information from vehicle-mounted terminals may be leaked during data transmission. We therefore construct an integrated framework that combines edge service caching and federated learning for the IoV. We further propose a resource management approach based on deep reinforcement learning (DRL) that jointly optimises task offloading decisions, communication and computing resource allocation, and edge service caching placement. Simulation results indicate that the DRL-based approach copes with the high complexity of the network system and the large policy-selection space, and achieves good convergence as well as an effective joint optimisation outcome.
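To make the joint decision space concrete, the sketch below shows how an offloading choice, a discretised resource share, and a caching choice can be bundled into a single reinforcement-learning action. It is a minimal tabular Q-learning toy, not the paper's DRL algorithm; the problem sizes, the placeholder environment, and the reward signal are all illustrative assumptions.

```python
import numpy as np

# Hypothetical problem sizes; the actual IoV state and action spaces are richer.
N_SERVICES = 3        # cacheable application service types
N_RES_LEVELS = 5      # discretised communication/computing resource shares
N_STATES = 50         # discretised system states (queue length, channel quality, ...)

# Joint action: (offload locally or to edge, resource share level, service to cache)
ACTIONS = [(off, res, svc)
           for off in (0, 1)
           for res in range(N_RES_LEVELS)
           for svc in range(N_SERVICES)]

rng = np.random.default_rng(0)
q_table = np.zeros((N_STATES, len(ACTIONS)))

def step(state, action):
    """Placeholder environment transition: returns (next_state, reward).
    A real IoV simulator would model transmission delay, computing delay,
    energy cost and cache-hit benefit for the chosen joint action."""
    next_state = int(rng.integers(N_STATES))
    reward = -rng.random()          # e.g. negative task-completion delay
    return next_state, reward

def select_action(state, eps=0.1):
    """Epsilon-greedy selection over the joint action space."""
    if rng.random() < eps:
        return int(rng.integers(len(ACTIONS)))   # explore
    return int(np.argmax(q_table[state]))        # exploit

alpha, gamma = 0.1, 0.95
state = int(rng.integers(N_STATES))
for _ in range(10_000):
    a = select_action(state)
    next_state, r = step(state, a)
    # Q-learning update over the joint (offloading, resource, caching) action
    q_table[state, a] += alpha * (r + gamma * q_table[next_state].max()
                                  - q_table[state, a])
    state = next_state
```

In practice the tabular value function would be replaced by a deep network (e.g. a DQN- or actor-critic-style agent) to handle the continuous, high-dimensional state described in the abstract; the sketch only illustrates how the three coupled decisions can share one action and one reward.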