With the emergence of vehicular edge computing (VEC) and electric vehicles (EVs), jointly handling computation and charging tasks is challenging due to limited resources and the dynamics of vehicular networks. This work studies the joint optimization of computation offloading and charging scheduling in VEC networks. Specifically, we optimize the offloading factors, charging association variables, and charging rates to minimize system delay and energy consumption, exploiting the dual role of EVs as nodes in both the information and energy networks. To handle the dynamic environment, we model the problem as a Markov decision process and solve it with the multi-agent deep deterministic policy gradient (MADDPG) algorithm, a multi-agent reinforcement learning (MARL) method with centralized training and decentralized execution. Simulation results demonstrate that the proposed approach significantly improves system utility while reducing energy consumption and latency.
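To make the objective concrete, the sketch below computes a weighted delay-plus-energy cost for a single EV as a function of the three decision variables named above (offloading factor, charging association, charging rate). The delay, energy, and charging models and all constants (`f_local`, `f_edge`, `bw`, `kappa`, `p_tx`, `station_caps`, etc.) are illustrative assumptions, not the paper's actual system model.

```python
# Illustrative per-EV cost model (weighted delay + energy); every
# formula and constant here is a simplifying assumption for exposition.

def ev_cost(alpha, assoc, rate,
            task_cycles=1e9, task_bits=1e6,     # task size (cycles, bits)
            f_local=1e9, f_edge=5e9, bw=1e7,    # CPU freqs (Hz), link rate (bit/s)
            kappa=1e-27, p_tx=0.5,              # CPU energy coeff., tx power (W)
            w_delay=0.5, w_energy=0.5,          # objective weights
            charge_demand=1e4,                  # energy to charge (J)
            station_caps=(5e3, 1e4)):           # per-station max charging rate (W)
    """Hypothetical weighted delay + energy cost for one EV.

    alpha : fraction of the task offloaded to the edge server, in [0, 1]
    assoc : index of the charging station the EV associates with
    rate  : requested charging rate (W), capped by the chosen station
    """
    # Delay: local and offloaded portions execute in parallel.
    t_local = (1 - alpha) * task_cycles / f_local
    t_off = alpha * task_bits / bw + alpha * task_cycles / f_edge
    delay = max(t_local, t_off)
    # Charging time at the associated station (rate limited by its cap).
    eff_rate = min(rate, station_caps[assoc])
    t_charge = charge_demand / eff_rate
    # Energy: local computation energy plus transmission energy.
    e_local = kappa * (1 - alpha) * task_cycles * f_local ** 2
    e_tx = p_tx * alpha * task_bits / bw
    energy = e_local + e_tx
    return w_delay * (delay + t_charge) + w_energy * energy
```

In a MARL formulation, each EV agent would select `(alpha, assoc, rate)` as its action, and the negative of this cost (summed over agents) would serve as the reward the centralized critic evaluates.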