In Vehicular Edge Computing (VEC) networks, task offloading scheduling has attracted increasing attention as an effective way to relieve the computational burden on vehicles. However, as vehicles become increasingly intelligent and connected, the complex data dependencies among in-vehicle tasks pose challenges for offloading scheduling. Moreover, scheduling fairness has a growing impact on the average Quality of Service (QoS) of vehicles in the network. To this end, we propose a dependent task offloading scheduling algorithm with fairness constraints based on a back-adjustment mechanism. First, to address the execution-order constraints imposed by dependent tasks and the scheduling fairness problem in multi-user scenarios, a two-level task sorting algorithm is proposed to determine the scheduling sequence of tasks. Then, the sequential task offloading scheduling process is modeled as a Markov Decision Process (MDP) and solved with a reinforcement learning method. Finally, a back-adjustment mechanism is designed to re-sort the task sequence and achieve the required scheduling fairness through an iterative process. Simulation results show that the proposed algorithm significantly improves scheduling fairness and reduces the average application completion time compared with baseline algorithms.
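As an illustrative sketch only (the abstract does not specify the fairness metric or the sorting details), the two building blocks a two-level sort typically relies on can be shown as follows: a dependency-respecting task ordering (here Kahn's topological sort, an assumed choice) and a quantitative fairness measure over per-vehicle completion times (here Jain's fairness index, also an assumption). The back-adjustment mechanism would then iteratively re-sort tasks until the fairness measure meets its constraint.

```python
from collections import deque

def jain_index(completion_times):
    # Jain's fairness index: 1.0 when all vehicles finish equally fast,
    # approaching 1/n as one vehicle dominates. Assumed metric, not
    # necessarily the one used in the paper.
    n = len(completion_times)
    s = sum(completion_times)
    sq = sum(t * t for t in completion_times)
    return (s * s) / (n * sq)

def topo_order(tasks, deps):
    # Kahn's algorithm: emit each task only after all of its
    # predecessors, satisfying the data-dependency constraints.
    # `deps[t]` lists the tasks that must complete before t.
    indeg = {t: len(deps.get(t, [])) for t in tasks}
    succ = {t: [] for t in tasks}
    for t, preds in deps.items():
        for p in preds:
            succ[p].append(t)
    ready = deque(t for t in tasks if indeg[t] == 0)
    order = []
    while ready:
        t = ready.popleft()
        order.append(t)
        for nxt in succ[t]:
            indeg[nxt] -= 1
            if indeg[nxt] == 0:
                ready.append(nxt)
    return order
```

For example, `topo_order(['a', 'b', 'c'], {'b': ['a'], 'c': ['a', 'b']})` yields the sequence `['a', 'b', 'c']`, and `jain_index` applied to the resulting per-vehicle completion times gives the scalar that an iterative re-sorting loop could test against a fairness threshold.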