The most basic requirement for autonomous vehicles is accurate environmental perception. However, limited onboard computing resources make it difficult for such vehicles to process all environmental information. With edge computing and 6G networks, vehicles can offload computing tasks to edge servers for execution, which alleviates the shortage of onboard resources. This work studies the environmental perception task offloading problem for autonomous vehicles, with the goal of improving perception quality through well-planned task offloading. After dynamic priorities are assigned to the tasks generated by autonomous vehicles, the multi-vehicle, multi-server offloading process is modeled as a Markov decision process, and an offloading decision algorithm based on deep reinforcement learning is designed to select the edge node that executes each task so as to obtain better long-term returns. The earliest-deadline-first (EDF) algorithm is extended to account for both the deadline and the priority of each task, allowing edge nodes to complete more high-priority tasks within their deadlines. Experimental results show that the proposed method guarantees the basic environmental perception requirements of every vehicle and outperforms existing methods in terms of both the total priority of tasks discarded in each scheduling period and the completion rate of critical tasks.
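To make the priority-aware scheduling idea concrete, the following is a minimal sketch of an EDF variant that also considers task priority. It is not the paper's exact algorithm: the `Task` fields, the drop rule (evicting the lowest-priority accepted task when a deadline would be missed, in the spirit of Moore-Hodgson), and the single-server setting are all assumptions made for illustration.

```python
import heapq
import itertools
from dataclasses import dataclass


@dataclass
class Task:
    name: str
    priority: int     # higher value = more important (hypothetical field)
    deadline: float   # latest allowed completion time, in seconds
    proc_time: float  # estimated execution time on the edge node, in seconds


def priority_aware_edf(tasks):
    """Greedy EDF variant: scan tasks in deadline order and, whenever the
    running schedule would miss a deadline, evict the lowest-priority task
    accepted so far rather than the most recent one."""
    counter = itertools.count()  # tie-breaker so the heap never compares Task objects
    accepted = []                # min-heap keyed by priority
    dropped = []
    elapsed = 0.0
    for t in sorted(tasks, key=lambda t: t.deadline):
        heapq.heappush(accepted, (t.priority, next(counter), t))
        elapsed += t.proc_time
        if elapsed > t.deadline:
            # Overloaded: drop the accepted task with the lowest priority.
            _, _, victim = heapq.heappop(accepted)
            elapsed -= victim.proc_time
            dropped.append(victim)
    # Remaining accepted tasks run in deadline order.
    schedule = sorted((entry[2] for entry in accepted), key=lambda t: t.deadline)
    return schedule, dropped


# Hypothetical usage: with these three perception tasks, the low-priority
# sign-recognition task is dropped and the two critical tasks meet their deadlines.
tasks = [
    Task("lane_detect", priority=5, deadline=0.05, proc_time=0.02),
    Task("obstacle",    priority=9, deadline=0.05, proc_time=0.03),
    Task("sign_recog",  priority=3, deadline=0.08, proc_time=0.04),
]
schedule, dropped = priority_aware_edf(tasks)
```

Compared with plain EDF, which ignores priority when the node is overloaded, this drop rule trades a lower task completion count for a smaller total priority of discarded tasks, which is the metric the evaluation emphasizes.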