Abstract
With the development of artificial intelligence, intelligent vehicles with diverse functions and complex system architectures bring ever more computing tasks to the vehicle. Because local computing resources on vehicles are insufficient, mobile edge computing is regarded as a way to relieve local computing pressure. In the Telematics setting, when a vehicle offloads a computation task to an edge server, the communication time between the vehicle and the base station is shortened by the vehicle's high-speed movement. If the vehicle leaves the coverage of the current base station before the computation is completed, it cannot obtain the computation results in time. Therefore, a task offloading scheme based on trajectory prediction in the context of Telematics is proposed to address the short communication time between vehicles and base stations caused by high-speed vehicle movement. The scheme combines Long Short-Term Memory and convolutional neural networks to predict which base stations the vehicle will pass and when it will reach them, so that computation results can be returned through communication between base stations and tasks with larger data volumes can be offloaded to the edge server. Simulation results show that the proposed scheme is well suited to the intelligent vehicle environment, exhibits greater stability when handling large computation tasks, and reduces task latency by about 25% compared with the traditional task offloading scheme.
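The abstract does not give implementation details of the predictor, so the following is only a minimal illustrative sketch of one way a combined CNN and LSTM model with two output heads (next base station and arrival time) could be structured in PyTorch. All layer sizes, input features, and names such as `N_STATIONS`, `SEQ_LEN`, and `TrajectoryPredictor` are assumptions, not taken from the paper.

```python
# Hypothetical sketch of a CNN + LSTM trajectory predictor.
# The feature set (x, y, speed, heading per time step), the number of
# candidate base stations, and all layer sizes are assumed for illustration.
import torch
import torch.nn as nn

N_STATIONS = 32   # assumed number of candidate next base stations
SEQ_LEN = 20      # assumed length of the trajectory history window
N_FEATURES = 4    # assumed per-step features: x, y, speed, heading


class TrajectoryPredictor(nn.Module):
    def __init__(self):
        super().__init__()
        # 1-D convolution extracts local motion patterns from the trajectory.
        self.conv = nn.Sequential(
            nn.Conv1d(N_FEATURES, 32, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        # LSTM captures longer-term temporal dependence of the motion.
        self.lstm = nn.LSTM(input_size=32, hidden_size=64, batch_first=True)
        # Two heads: which base station the vehicle will reach next,
        # and how long until it arrives there.
        self.station_head = nn.Linear(64, N_STATIONS)  # classification logits
        self.time_head = nn.Linear(64, 1)              # arrival-time regression

    def forward(self, traj):
        # traj: (batch, SEQ_LEN, N_FEATURES)
        h = self.conv(traj.transpose(1, 2)).transpose(1, 2)  # (batch, SEQ_LEN, 32)
        _, (hidden, _) = self.lstm(h)
        last = hidden[-1]                                     # (batch, 64)
        return self.station_head(last), self.time_head(last)


model = TrajectoryPredictor()
dummy = torch.randn(8, SEQ_LEN, N_FEATURES)
station_logits, arrival_time = model(dummy)
print(station_logits.shape, arrival_time.shape)  # (8, 32) and (8, 1)
```

In such a design the station head would typically be trained with a cross-entropy loss and the arrival-time head with a mean-squared-error loss; the offloading decision could then use the predicted handover target and dwell time to decide where results should be forwarded.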